Congratulations!
Recurrent Neural Networks (RNNs) for Language Modeling with Keras
David Cecchini
Data Scientist
Wrap-up
Introduction to language tasks:
Sentiment classification
Multi-class classification
Text generation
Neural Machine Translation
Sequence to sequence models
Implementation in Keras
RNN pitfalls and different cell types
Vanishing and exploding gradient problems
GRU and LSTM cells
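The gradient problems above can be seen with a toy calculation: backpropagation through time multiplies the gradient by the recurrent weight once per step, so it shrinks toward zero or blows up over long sequences (GRU and LSTM cells mitigate this with gating). A minimal sketch, assuming a hypothetical one-unit linear RNN:

```python
# Toy demonstration of vanishing/exploding gradients in a
# hypothetical one-unit linear RNN: the gradient through time is
# the recurrent weight raised to the number of steps.
def gradient_magnitude(w, steps):
    grad = 1.0
    for _ in range(steps):
        grad *= w  # one multiplication per unrolled time step
    return abs(grad)

vanish = gradient_magnitude(0.5, 50)   # |w| < 1: gradient shrinks toward 0
explode = gradient_magnitude(1.5, 50)  # |w| > 1: gradient grows without bound
```

With 50 steps, a weight of 0.5 leaves essentially no gradient signal, while 1.5 overflows useful ranges; this is why plain RNN cells struggle with long-range dependencies.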
Word vectors and the Embedding layer
Better sentiment analysis
Multi-class classification
Data preparation
Transfer learning
Keras models
Model performance
Text generation and NMT
Text Generation
Characters as tokens
Data preparation
Generate sentences mimicking Sheldon
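The character-level data preparation can be sketched as sliding a fixed-size window over the corpus: each input is a run of characters and the target is the character that follows. A minimal sketch (the quote is a hypothetical stand-in for the Sheldon dialogue corpus used in the course):

```python
# Character-level data preparation for text generation.
# `text` is a hypothetical stand-in corpus.
text = "bazinga! bazinga!"
maxlen = 5  # window size (assumption)

chars = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(chars)}

# Slide a window over the text: input = `maxlen` characters,
# target = the next character.
inputs, targets = [], []
for i in range(len(text) - maxlen):
    inputs.append([char_to_id[c] for c in text[i:i + maxlen]])
    targets.append(char_to_id[text[i + maxlen]])
```

At generation time the trained model predicts the next character from a seed window, the prediction is appended, and the window slides forward one character at a time.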
Neural Machine Translation
Words as tokens
Data preparation: encoders and decoders
Translate Portuguese to English
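The encoder/decoder data preparation can be sketched as follows: the encoder consumes the source-language tokens, while the decoder is trained with teacher forcing, so its input is the target sentence prefixed with a start marker and its target is the same sentence followed by an end marker. A minimal sketch with a hypothetical sentence pair and marker tokens:

```python
# Sketch of encoder/decoder sequence preparation for NMT.
# The sentence pair and the <sos>/<eos> marker names are assumptions.
pt = "eu gosto de gatos"
en = "i like cats"

encoder_tokens = pt.split()                 # source fed to the encoder
decoder_input = ["<sos>"] + en.split()      # shifted right: teacher forcing
decoder_target = en.split() + ["<eos>"]     # what the decoder must predict
```

The shift keeps input and target the same length: at each step the decoder sees the previous correct word and must predict the next one.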
Congratulations!!!