Congratulations!

Transformer Models with PyTorch

James Chapman

Curriculum Manager, DataCamp

Chapter 1

import torch.nn as nn

# PyTorch's built-in transformer, configured as in the course
model = nn.Transformer(
    d_model=1536,
    nhead=8,
    num_encoder_layers=6,
    num_decoder_layers=6
)
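The configured `nn.Transformer` can be exercised with a quick forward pass. This is an illustrative sketch, not course code: a smaller `d_model` keeps it light, and the shapes are assumptions.

```python
import torch
import torch.nn as nn

# Smaller d_model than the slide, purely to keep the example fast
model = nn.Transformer(
    d_model=64,
    nhead=8,
    num_encoder_layers=2,
    num_decoder_layers=2,
    batch_first=True,
)
src = torch.rand(2, 10, 64)  # (batch, source length, d_model)
tgt = torch.rand(2, 12, 64)  # (batch, target length, d_model)
out = model(src, tgt)        # output follows the target sequence length
print(out.shape)             # torch.Size([2, 12, 64])
```

Note that `nn.Transformer` expects already-embedded inputs; token embedding and positional encoding are added separately, which is why the course builds those modules by hand.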
class InputEmbeddings(nn.Module): ...
class PositionalEncoding(nn.Module): ...
class MultiHeadAttention(nn.Module): ...
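As one example of the modules listed above, a sinusoidal `PositionalEncoding` is commonly sketched as below. This is a standard implementation under assumed names and sizes, not necessarily the course's exact code.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding (common sketch; names are illustrative)."""
    def __init__(self, d_model, max_len=512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
        # Frequencies decay geometrically across the embedding dimensions
        div_term = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        pe[:, 0::2] = torch.sin(position * div_term)  # even dims: sine
        pe[:, 1::2] = torch.cos(position * div_term)  # odd dims: cosine
        self.register_buffer("pe", pe.unsqueeze(0))   # (1, max_len, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model); add the matching slice of encodings
        return x + self.pe[:, : x.size(1)]

enc = PositionalEncoding(d_model=16)
out = enc(torch.zeros(2, 5, 16))
print(out.shape)  # torch.Size([2, 5, 16])
```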

The transformer architecture as introduced in the paper "Attention Is All You Need".


Encoder-only transformer

Encoder-only transformer architecture
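An encoder-only model (BERT-style) can be sketched with PyTorch's `nn.TransformerEncoder` plus a task head. The class name, pooling choice, and sizes below are illustrative assumptions, not the course's exact code.

```python
import torch
import torch.nn as nn

class EncoderClassifier(nn.Module):
    """Encoder-only transformer for sequence classification (illustrative)."""
    def __init__(self, d_model=64, nhead=8, num_layers=2, num_classes=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, x):
        h = self.encoder(x)                    # (batch, seq_len, d_model)
        return self.classifier(h.mean(dim=1))  # mean-pool tokens, then classify

model = EncoderClassifier()
logits = model(torch.rand(4, 10, 64))  # embedded inputs
print(logits.shape)                    # torch.Size([4, 2])
```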

Decoder-only transformer

Decoder-only transformer architecture
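A decoder-only model (GPT-style) is often built from the same encoder layers plus a causal mask, so each position can only attend to earlier ones. This is a minimal sketch under assumed sizes, not the course's exact code.

```python
import torch
import torch.nn as nn

d_model, nhead, seq_len = 64, 8, 10
layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
decoder_only = nn.TransformerEncoder(layer, num_layers=2)

# Upper-triangular mask: -inf above the diagonal blocks attention to the future
causal_mask = nn.Transformer.generate_square_subsequent_mask(seq_len)

x = torch.rand(2, seq_len, d_model)        # embedded inputs
out = decoder_only(x, mask=causal_mask)
print(out.shape)                           # torch.Size([2, 10, 64])
```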


Chapter 2 - Encoder-decoder transformer

Original transformer architecture


What next?


Let's practice!

