Evaluation of multi-output models and loss weighting

Intermediate Deep Learning with PyTorch

Michal Oleszak

Machine Learning Engineer

Model evaluation

acc_alpha = Accuracy(
    task="multiclass", num_classes=30
)
acc_char = Accuracy(
    task="multiclass", num_classes=964
)


net.eval()
with torch.no_grad():
    for images, labels_alpha, labels_char \
    in dataloader_test:
        out_alpha, out_char = net(images)
        _, pred_alpha = torch.max(out_alpha, 1)
        _, pred_char = torch.max(out_char, 1)
        acc_alpha(pred_alpha, labels_alpha)
        acc_char(pred_char, labels_char)
  • Set up metric for each output
  • Iterate over test loader and get outputs
  • Calculate prediction for each output
  • Update accuracy metrics
  • Calculate final accuracy scores
print(f"Alphabet: {acc_alpha.compute()}")
print(f"Character: {acc_char.compute()}")
Alphabet: 0.3166305720806122
Character: 0.24064336717128754
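The metric objects above follow an update-then-compute pattern: each call accumulates correct and total counts across test batches, and compute() returns the final score. A minimal stand-in class (not the torchmetrics implementation, just a sketch of the same accumulation logic) makes the pattern explicit:

```python
class RunningAccuracy:
    """Minimal stand-in for torchmetrics' Accuracy: each call
    accumulates correct/total counts across batches, and
    compute() returns the final accuracy over all batches."""
    def __init__(self):
        self.correct = 0
        self.total = 0

    def __call__(self, preds, labels):
        self.correct += sum(int(p == y) for p, y in zip(preds, labels))
        self.total += len(labels)

    def compute(self):
        return self.correct / self.total


acc = RunningAccuracy()
acc([0, 1, 2], [0, 1, 1])  # batch 1: 2 of 3 correct
acc([3, 3], [3, 0])        # batch 2: 1 of 2 correct
print(acc.compute())       # 3 / 5 = 0.6
```

This is why the final scores are read with .compute() only after the test loop finishes: the metric has then seen every batch.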

Multi-output training loop revisited

for epoch in range(10):
    for images, labels_alpha, labels_char \
    in dataloader_train:
        optimizer.zero_grad()
        outputs_alpha, outputs_char = net(images)
        loss_alpha = criterion(
          outputs_alpha, labels_alpha
        )
        loss_char = criterion(
          outputs_char, labels_char
        )
        loss = loss_alpha + loss_char
        loss.backward()
        optimizer.step()
  • Two losses: one for alphabet classification, one for character classification
  • Final loss defined as sum of alphabet and character losses: loss = loss_alpha + loss_char
  • Both classification tasks deemed equally important
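The loop assumes that net(images) returns two outputs, one per task. A minimal sketch of such a two-head model (the class name, trunk size, and input shape are illustrative assumptions, not the course's exact architecture) could look like this:

```python
import torch
import torch.nn as nn


class TwoHeadNet(nn.Module):
    """Shared trunk feeding two separate linear heads,
    one per classification task."""
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 64, 128),
            nn.ReLU(),
        )
        self.head_alpha = nn.Linear(128, 30)   # 30 alphabet classes
        self.head_char = nn.Linear(128, 964)   # 964 character classes

    def forward(self, x):
        features = self.trunk(x)
        # Return one output per task, as the training loop expects
        return self.head_alpha(features), self.head_char(features)


net = TwoHeadNet()
out_alpha, out_char = net(torch.randn(8, 1, 64, 64))
print(out_alpha.shape, out_char.shape)
```

Because both heads share the trunk, a single backward pass on the summed loss updates the shared parameters with gradients from both tasks.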

Varying task importance

Suppose character classification is twice as important as alphabet classification

  • Approach 1: Scale more important loss by a factor of 2

    loss = loss_alpha + loss_char * 2
    
  • Approach 2: Assign weights that sum to 1

    loss = 0.33 * loss_alpha + 0.67 * loss_char
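Both approaches encode the same 1:2 importance ratio; only the overall scale of the loss differs. A quick numerical check with made-up scalar loss values:

```python
# Illustrative (made-up) scalar loss values
loss_alpha, loss_char = 1.2, 0.8

# Approach 1: scale the more important loss by a factor of 2
loss_v1 = loss_alpha + loss_char * 2

# Approach 2: weights summing to 1 with (approximately) the same ratio
loss_v2 = 0.33 * loss_alpha + 0.67 * loss_char

# The character loss carries twice the weight in both cases:
# 2 / 1 = 2.0, and 0.67 / 0.33 is approximately 2.03
print(loss_v1, loss_v2)
```

Since gradient direction depends on relative, not absolute, weights, the two approaches differ mainly in how the loss magnitude interacts with the learning rate.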
    

Warning: losses on different scales

  • Losses must be on the same scale before they are weighted and added
  • Example tasks:

    • Predict house price -> MSE loss
    • Predict quality: low, medium, high -> CrossEntropy loss
  • CrossEntropy loss is typically in the single digits

  • MSE loss can reach tens of thousands
  • Model would ignore quality assessment task
  • Solution: Normalize both losses before weighting and adding
    loss_price = loss_price / torch.max(loss_price)
    loss_quality = loss_quality / torch.max(loss_quality)
    loss = 0.7 * loss_price + 0.3 * loss_quality
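A small demonstration of the scale mismatch and the normalization step. Note that dividing by torch.max(loss) is only meaningful when the loss is a tensor of per-sample values, so this sketch uses reduction="none"; the data shapes and magnitudes here are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Regression task: house prices on the order of 100,000
price_pred = torch.randn(16) * 100_000
price_true = torch.randn(16) * 100_000
# Per-sample squared errors (no reduction), so max-normalization applies
loss_price = F.mse_loss(price_pred, price_true, reduction="none")

# Classification task: quality in {low, medium, high}
quality_logits = torch.randn(16, 3)
quality_true = torch.randint(0, 3, (16,))
loss_quality = F.cross_entropy(quality_logits, quality_true, reduction="none")

# MSE is many orders of magnitude larger than cross-entropy here
print(loss_price.mean().item(), loss_quality.mean().item())

# Rescale each loss into [0, 1] before weighting and summing
loss_price = loss_price / torch.max(loss_price)
loss_quality = loss_quality / torch.max(loss_quality)
loss = (0.7 * loss_price + 0.3 * loss_quality).mean()
print(loss.item())
```

Without the normalization, the price term would dominate the gradient and the quality task would effectively be ignored.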
    

Let's practice!
