Roadmap
Use the Keras library initially (potentially moving to TensorFlow later on). The following sample code comes from "Predicting sequences of vectors (regression) in Keras using RNN - LSTM":
from keras.models import Sequential
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM

model = Sequential()
# Stacked LSTMs: 5-dimensional input, hidden sizes 300 -> 500 -> 200
model.add(LSTM(5, 300, return_sequences=True))
model.add(LSTM(300, 500, return_sequences=True))
model.add(Dropout(0.2))
# The last LSTM returns only its final output vector
model.add(LSTM(500, 200, return_sequences=False))
model.add(Dropout(0.2))
# Linear output layer mapping 200 features to 3 targets
model.add(Dense(200, 3))
model.add(Activation("linear"))
# Use our own loss function here instead of "mean_squared_error", etc. (see below)
model.compile(loss="mean_squared_error", optimizer="rmsprop")
We need to choose a custom loss function and pass it, with the expected prototype, to model.compile(...) in place of the "mean_squared_error" string, as sketched below.
We won't need to compute the derivative of the loss function ourselves, since Keras will do it automatically.
The sample model code above does not account for the full architecture of the LSTM.