Wrap-up and the final showdown

Machine Translation with Keras

Thushan Ganegedara

Data Scientist and Author

What you've done so far

  • Chapter 1
    • Introduction to encoder-decoder architecture
    • Understanding the GRU layer
  • Chapter 2
    • Implementing the encoder
    • Implementing the decoder
    • Implementing the decoder prediction layer (all three pieces are sketched in code after this list)
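
The Chapter 1 and 2 pieces fit together roughly as follows. This is a minimal sketch, not the course's exact code: the sequence lengths, vocabulary sizes, hidden size, and the fr_* naming for the target language are illustrative placeholders.

```python
from tensorflow.keras.layers import Input, GRU, RepeatVector, TimeDistributed, Dense
from tensorflow.keras.models import Model

en_len, fr_len = 15, 12        # placeholder sequence lengths
en_vocab, fr_vocab = 150, 200  # placeholder vocabulary sizes
hsize = 48                     # placeholder hidden size

# Encoder: consumes one-hot encoded English words and returns
# its final GRU state as the context vector
en_inputs = Input(shape=(en_len, en_vocab))
_, en_state = GRU(hsize, return_state=True)(en_inputs)

# Decoder: repeats the context vector for every output timestep
# and runs a GRU over the repeated sequence
de_out = GRU(hsize, return_sequences=True)(RepeatVector(fr_len)(en_state))

# Decoder prediction layer: a softmax over the target vocabulary
# at every timestep
de_pred = TimeDistributed(Dense(fr_vocab, activation='softmax'))(de_out)

model = Model(inputs=en_inputs, outputs=de_pred)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['acc'])
```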

What you've done so far

  • Chapter 3
    • Preprocessing data
    • Training the machine translation model
    • Generating translations
  • Chapter 4
    • Introduction to teacher forcing
    • Training a model with teacher forcing (sketched below)
    • Generating translations
    • Using word embeddings for machine translation
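
To make the teacher-forcing recap concrete, here is a minimal sketch of how the decoder changes in Chapter 4: during training it consumes the ground-truth translation shifted by one timestep, rather than only the repeated context vector. Names and sizes are placeholders, not the course's exact code.

```python
from tensorflow.keras.layers import Input, GRU, TimeDistributed, Dense
from tensorflow.keras.models import Model

en_len, fr_len = 15, 12        # placeholder sequence lengths
en_vocab, fr_vocab = 150, 200  # placeholder vocabulary sizes
hsize = 48                     # placeholder hidden size

# Encoder: unchanged, still produces the context vector
en_inputs = Input(shape=(en_len, en_vocab))
_, en_state = GRU(hsize, return_state=True)(en_inputs)

# Decoder with teacher forcing: consumes the previous ground-truth
# word at each timestep, initialized with the context vector
de_inputs = Input(shape=(fr_len - 1, fr_vocab))
de_out = GRU(hsize, return_sequences=True)(de_inputs, initial_state=en_state)
de_pred = TimeDistributed(Dense(fr_vocab, activation='softmax'))(de_out)

model = Model(inputs=[en_inputs, de_inputs], outputs=de_pred)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['acc'])

# Training pairs: feed fr_onehot[:, :-1, :] as the decoder input and
# fit against fr_onehot[:, 1:, :] as the labels (shifted by one step)
```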

Machine translation models

  • Model 1
    • The encoder consumes English words (one-hot encoded) and outputs a context vector
    • The decoder consumes the context vector and outputs the translation
  • Model 2
    • The encoder consumes English words (one-hot encoded) and outputs a context vector
    • The decoder consumes each word of the translation (one-hot encoded) and predicts the next word
  • Model 3
    • Uses word vectors instead of one-hot encoding (see the sketch after this list)
    • Word vectors capture semantic relationships between words
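
A minimal sketch of the Model 3 change: the one-hot inputs are replaced by integer word IDs fed through Embedding layers, which learn dense word vectors. As before, the names and sizes are illustrative placeholders rather than the course's exact code.

```python
from tensorflow.keras.layers import Input, Embedding, GRU, TimeDistributed, Dense
from tensorflow.keras.models import Model

en_len, fr_len = 15, 12        # placeholder sequence lengths
en_vocab, fr_vocab = 150, 200  # placeholder vocabulary sizes
hsize, emb_size = 48, 96       # placeholder hidden and embedding sizes

# Inputs are now sequences of word IDs rather than one-hot vectors
en_inputs = Input(shape=(en_len,))
en_emb = Embedding(en_vocab, emb_size)(en_inputs)   # learned English word vectors
_, en_state = GRU(hsize, return_state=True)(en_emb)

de_inputs = Input(shape=(fr_len - 1,))
de_emb = Embedding(fr_vocab, emb_size)(de_inputs)   # learned target-language word vectors
de_out = GRU(hsize, return_sequences=True)(de_emb, initial_state=en_state)
de_pred = TimeDistributed(Dense(fr_vocab, activation='softmax'))(de_out)

model = Model(inputs=[en_inputs, de_inputs], outputs=de_pred)
# Sparse loss: the labels can stay as integer word IDs
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['acc'])
```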

Performance of different models

Latest developments and further reading

All the best!
