Intermediate Deep Learning with PyTorch
Michal Oleszak
Machine Learning Engineer
from torchmetrics import Accuracy

acc_alpha = Accuracy(task="multiclass", num_classes=30)
acc_char = Accuracy(task="multiclass", num_classes=964)

net.eval()
with torch.no_grad():
    for images, labels_alpha, labels_char in dataloader_test:
        out_alpha, out_char = net(images)
        _, pred_alpha = torch.max(out_alpha, 1)
        _, pred_char = torch.max(out_char, 1)
        acc_alpha(pred_alpha, labels_alpha)
        acc_char(pred_char, labels_char)
print(f"Alphabet: {acc_alpha.compute()}")
print(f"Character: {acc_char.compute()}")
Alphabet: 0.3166305720806122
Character: 0.24064336717128754
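Both the evaluation loop above and the training loop below call net(images) and unpack two outputs, one per task. A minimal sketch of such a two-headed model is shown here; the shared layers, layer sizes, and the 64x64 grayscale input assumption are illustrative, not the course's exact architecture:

import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, num_alphabets=30, num_chars=964):
        super().__init__()
        # Shared feature extractor (assumes flattened 64x64 grayscale images)
        self.body = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64, 128),
            nn.ReLU(),
        )
        # One classification head per task
        self.head_alpha = nn.Linear(128, num_alphabets)
        self.head_char = nn.Linear(128, num_chars)

    def forward(self, x):
        x = self.body(x)
        return self.head_alpha(x), self.head_char(x)

net = Net()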
for epoch in range(10):
    for images, labels_alpha, labels_char in dataloader_train:
        optimizer.zero_grad()
        outputs_alpha, outputs_char = net(images)
        loss_alpha = criterion(outputs_alpha, labels_alpha)
        loss_char = criterion(outputs_char, labels_char)
        loss = loss_alpha + loss_char
        loss.backward()
        optimizer.step()
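The loop assumes criterion and optimizer were defined beforehand. A minimal setup could look like this; the optimizer choice and learning rate are assumptions:

import torch.nn as nn
import torch.optim as optim

criterion = nn.CrossEntropyLoss()                   # shared by both classification heads
optimizer = optim.Adam(net.parameters(), lr=0.001)  # assumed optimizer and learning rate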
loss = loss_alpha + loss_char
What if character classification is two times more important than alphabet classification?
Approach 1: Scale the more important loss by a factor of 2
loss = loss_alpha + loss_char * 2
Approach 2: Assign weights that sum to 1
loss = 0.33 * loss_alpha + 0.67 * loss_char
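With made-up loss values, the two approaches encode roughly the same 1:2 preference but yield totals on different scales:

import torch

loss_alpha = torch.tensor(1.2)  # hypothetical alphabet loss
loss_char = torch.tensor(0.8)   # hypothetical character loss

loss_alpha + loss_char * 2              # tensor(2.8000): larger than either individual loss
0.33 * loss_alpha + 0.67 * loss_char    # tensor(0.9320): stays on the original scale

Keeping the weights summing to 1 makes the combined loss easier to compare with the individual task losses and with runs that use different weightings.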
Losses must be on similar scales before they are combined. Example tasks: regressing a house's price with MSE loss, which can take very large values, and assessing its quality with cross-entropy loss, which is typically in the single digits. Normalize each loss by its maximum before weighting:
loss_price = loss_price / torch.max(loss_price)
loss_quality = loss_quality / torch.max(loss_quality)
loss = 0.7 * loss_price + 0.3 * loss_quality
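Note that dividing a single scalar loss by its own maximum simply gives 1; the normalization is meaningful when each loss is a tensor of per-sample values, so torch.max ranges over the batch. A runnable sketch under that assumption, with made-up data and a hypothetical 5-class quality label:

import torch
import torch.nn as nn

# Made-up batch: a regression target for price, a 5-class target for quality
pred_price = torch.rand(16) * 500_000
labels_price = torch.rand(16) * 500_000
out_quality = torch.randn(16, 5)
labels_quality = torch.randint(0, 5, (16,))

# Per-sample losses (reduction="none"), so torch.max ranges over the batch
loss_price = nn.MSELoss(reduction="none")(pred_price, labels_price)                # very large values
loss_quality = nn.CrossEntropyLoss(reduction="none")(out_quality, labels_quality)  # single digits

# Rescale each loss to a maximum of 1, then weight and average
loss_price = loss_price / torch.max(loss_price)
loss_quality = loss_quality / torch.max(loss_quality)
loss = (0.7 * loss_price + 0.3 * loss_quality).mean()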