Introduction to Deep Learning with PyTorch
Jasmin Ludolf
Senior Data Science Content Developer, DataCamp
Can we solve the problem?
Set a performance baseline



Modify the training loop to overfit a single data point:

# Fetch one fixed batch and train on it repeatedly
features, labels = next(iter(dataloader))
for i in range(1000):
    outputs = model(features)
    loss = criterion(outputs, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
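The loop above can be made self-contained as follows. This is a minimal sketch: the toy data, the two-layer model, the MSE loss, and the Adam optimizer are all illustrative stand-ins, not the course's actual setup. The point is the diagnostic itself: a model with enough capacity should drive the loss on one memorized batch close to zero.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# One fixed batch, standing in for next(iter(dataloader))
features = torch.randn(8, 4)
labels = torch.randn(8, 1)

# Illustrative model, loss, and optimizer (assumptions, not course code)
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

first_loss = None
for i in range(1000):
    outputs = model(features)
    loss = criterion(outputs, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if first_loss is None:
        first_loss = loss.item()

print(first_loss, loss.item())
```

If the loss does not shrink toward zero here, the bug is in the model or training loop, not in the data or the hyperparameters.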
Then scale up to the entire training set
Goal: maximize the validation accuracy
Experiment with:
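A sketch of how the validation accuracy named above can be computed; `validation_loader` and `model` are placeholders for your own objects, and the function name is illustrative.

```python
import torch

def validation_accuracy(model, validation_loader):
    model.eval()                       # disable dropout / batch-norm updates
    correct, total = 0, 0
    with torch.no_grad():              # no gradients needed for evaluation
        for features, labels in validation_loader:
            outputs = model(features)
            preds = outputs.argmax(dim=1)       # class with the highest score
            correct += (preds == labels).sum().item()
            total += labels.size(0)
    model.train()                      # restore training mode
    return correct / total
```

Calling `model.eval()` before scoring and `model.train()` after matters once regularization layers such as dropout are in play, since they behave differently in the two modes.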
Original model overfitting training data

Updated model with too much regularization

Grid search: evenly spaced powers of ten

for factor in range(2, 6):
    lr = 10 ** -factor

Random search: the exponent is sampled uniformly, so the learning rate is log-uniform

factor = np.random.uniform(2, 6)
lr = 10 ** -factor
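The two sampling strategies above can be compared side by side in a runnable sketch; the number of random samples is arbitrary here.

```python
import numpy as np

np.random.seed(42)  # seed chosen only for reproducibility of this sketch

# Grid search: factor in {2, 3, 4, 5} gives lr in {1e-2, 1e-3, 1e-4, 1e-5}
grid_lrs = [10 ** -factor for factor in range(2, 6)]

# Random search: factor drawn uniformly from [2, 6],
# so lr is log-uniform over [1e-6, 1e-2]
random_lrs = [10 ** -np.random.uniform(2, 6) for _ in range(4)]

print(grid_lrs)
print(random_lrs)
```

Sampling the exponent (rather than the learning rate itself) keeps the search evenly spread across orders of magnitude, which is usually what matters for learning rates.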
