Recurrent Neural Networks (RNNs) for Language Modeling with Keras
David Cecchini
Data Scientist
The Text Generation Model:
categorical_crossentropy as loss function:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(units, input_shape=(chars_window, n_vocab), dropout=0.15, recurrent_dropout=0.15, return_sequences=True))
model.add(LSTM(units, dropout=0.15, recurrent_dropout=0.15, return_sequences=False))
model.add(Dense(n_vocab, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')
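As a rough end-to-end sketch of how this model could be fit, the snippet below repeats the architecture above with assumed placeholder values for units, chars_window, and n_vocab, and random one-hot data standing in for a real character corpus:

# Sketch only: placeholder hyperparameters and random data, not the course corpus.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

units, chars_window, n_vocab = 128, 40, 60   # assumed values
n_samples = 1000                             # assumed number of training windows

# X: one-hot encoded character windows, shape (n_samples, chars_window, n_vocab)
# y: one-hot encoded next character, shape (n_samples, n_vocab)
X = np.eye(n_vocab)[np.random.randint(0, n_vocab, size=(n_samples, chars_window))]
y = np.eye(n_vocab)[np.random.randint(0, n_vocab, size=n_samples)]

model = Sequential()
model.add(LSTM(units, input_shape=(chars_window, n_vocab), dropout=0.15,
               recurrent_dropout=0.15, return_sequences=True))
model.add(LSTM(units, dropout=0.15, recurrent_dropout=0.15, return_sequences=False))
model.add(Dense(n_vocab, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam')

model.fit(X, y, epochs=2, batch_size=64)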
Difference to classification: the Dense output layer has n_vocab units (one per character in the vocabulary) rather than one unit per class, so the softmax gives a probability distribution over the next character.
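To illustrate that difference, here is a sketch of predicting a single character with the trained model; char_to_index and index_to_char are assumed lookup tables built when the corpus was one-hot encoded:

import numpy as np

def sample_next_char(model, seed_text, char_to_index, index_to_char, n_vocab):
    # seed_text is assumed to have length chars_window (the model's fixed window size).
    # One-hot encode the seed window: shape (1, len(seed_text), n_vocab)
    x = np.zeros((1, len(seed_text), n_vocab))
    for t, ch in enumerate(seed_text):
        x[0, t, char_to_index[ch]] = 1.0
    # The softmax output is a probability distribution over the whole vocabulary
    probs = model.predict(x, verbose=0)[0]
    probs = probs / probs.sum()                      # guard against rounding drift
    next_index = np.random.choice(n_vocab, p=probs)  # sample rather than argmax
    return index_to_char[next_index]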