Hey guys, I have built an LSTM model that works, and now I am trying (unsuccessfully) to add an Embedding layer as the first layer.
This solution didn't work for me. I also read these questions before asking: "Keras input explanation: input_shape, units, batch_size, dim, etc", "Understanding Keras LSTMs", and the Keras examples.
My input is a one-hot encoding (of ones and zeros) of characters of a language that consists of 27 letters. I chose to represent each word as a sequence of 10 characters, so each word has shape (10, 27). I have 465 of them, so X_train.shape is (465, 10, 27), and I also have labels with y_train.shape (465, 1). My goal is to train the model and, while doing that, build character embeddings.
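For concreteness, here is a minimal sketch of dummy data with these shapes (the contents are random placeholders, not my real data):

import numpy as np

n_words, seq_len, n_chars = 465, 10, 27
# one random letter per position, one-hot encoded
idx = np.random.randint(0, n_chars, size=(n_words, seq_len))
X_train = np.zeros((n_words, seq_len, n_chars))
X_train[np.arange(n_words)[:, None], np.arange(seq_len), idx] = 1
y_train = np.random.randint(0, 2, size=(n_words, 1))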
This is the model that compiles and fits (imports shown for completeness):
from keras.models import Model
from keras.layers import Input, Dense, LSTM, Bidirectional, Embedding

main_input = Input(shape=(10, 27))
x = Bidirectional(LSTM(5))(main_input)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_train, y_train, epochs=10, batch_size=1, verbose=1)
After adding an Embedding layer:
main_input = Input(shape=(10, 27))
emb = Embedding(input_dim=2, output_dim=10)(main_input)  # output: (batch, 10, 27, 10)
x = Bidirectional(LSTM(5))(emb)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_train, y_train, epochs=10, batch_size=1, verbose=1)
The output:

ValueError: Input 0 is incompatible with layer bidirectional_31: expected ndim=3, found ndim=4
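If I understand the error correctly, Embedding maps every scalar in its input to an output_dim-sized vector, so my (batch, 10, 27) one-hot input becomes (batch, 10, 27, 10), which is the ndim=4 the LSTM rejects. My guess is that Embedding expects integer character indices rather than one-hot rows, something like this (untested sketch, with input_dim=27 for the alphabet):

# recover integer indices 0..26 from the one-hot encoding
X_idx = X_train.argmax(axis=-1)                            # shape (465, 10)

main_input = Input(shape=(10,))                            # 10 character indices per word
emb = Embedding(input_dim=27, output_dim=10)(main_input)   # -> (batch, 10, 10)
x = Bidirectional(LSTM(5))(emb)
de = Dense(1, activation='sigmoid')(x)
model = Model(inputs=main_input, outputs=de)
model.compile(loss='binary_crossentropy', optimizer='adam')
model.fit(X_idx, y_train, epochs=10, batch_size=1, verbose=1)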
Is that the right direction, or how else do I fix the shapes here? Your ideas would be much appreciated.