# input_shape = (137861, 21, 1)
# output_sequence_length = 21
# english_vocab_size = 199
# french_vocab_size = 344

from keras.models import Sequential
from keras.layers import Embedding, GRU, TimeDistributed, Dense, Activation
from keras.losses import sparse_categorical_crossentropy
from keras.optimizers import Adam

def embed_model(input_shape, output_sequence_length, english_vocab_size, french_vocab_size):
    '''
    Build an RNN model using word embedding on x and y
    :param input_shape: Tuple of input shape
    :param output_sequence_length: Length of output sequence
    :param english_vocab_size: Number of unique English words in the dataset
    :param french_vocab_size: Number of unique French words in the dataset
    :return: Keras model built, but not trained
    '''

    learning_rate = 1e-3
    model = Sequential()

    model.add(Embedding(english_vocab_size, 128, input_length=output_sequence_length, input_shape=input_shape[1:]))

    model.add(GRU(units=128, return_sequences=True))
    model.add(TimeDistributed(Dense(french_vocab_size)))
    model.add(Activation('softmax'))

    model.summary()

    model.compile(loss=sparse_categorical_crossentropy,
                  optimizer=Adam(learning_rate),
                  metrics=['accuracy'])

    return model

When I invoke this function to build and train a model, I get the following error:

ValueError: Input 0 is incompatible with layer gru_1: expected ndim=3, found ndim=4

How do I fix the shape mismatch between the Embedding layer and the GRU layer?


1 Answer


The problem is that the Embedding layer takes a 2D array of shape (batch_size, sequence_length) as its input. However, the input array here has shape (137861, 21, 1), which makes it a 3D array. Simply remove the last axis with NumPy's squeeze() method:

data = np.squeeze(data, axis=-1)
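
For example, a quick sanity check (using a placeholder array with the same shape (137861, 21, 1) described in the question, in place of the real preprocessed data):

import numpy as np

data = np.zeros((137861, 21, 1))   # placeholder for the real preprocessed input
print(data.shape)                  # (137861, 21, 1) -- 3D, rejected by Embedding
data = np.squeeze(data, axis=-1)
print(data.shape)                  # (137861, 21) -- 2D, as Embedding expects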

As a side note, there is no need to use a TimeDistributed wrapper here, since the Dense layer is applied on the last axis by default.
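
Putting both points together, a minimal sketch of the fixed model (assuming the standard Keras imports and the same layer sizes as in the question; an illustration, not code tested on the actual dataset) could look like this:

from keras.models import Sequential
from keras.layers import Embedding, GRU, Dense
from keras.losses import sparse_categorical_crossentropy
from keras.optimizers import Adam

def embed_model(input_shape, output_sequence_length, english_vocab_size, french_vocab_size):
    model = Sequential()
    # Embedding now receives 2D input of shape (batch_size, sequence_length)
    model.add(Embedding(english_vocab_size, 128, input_length=output_sequence_length))
    model.add(GRU(units=128, return_sequences=True))
    # Dense acts on the last axis of the 3D GRU output, so TimeDistributed is unnecessary
    model.add(Dense(french_vocab_size, activation='softmax'))
    model.compile(loss=sparse_categorical_crossentropy,
                  optimizer=Adam(1e-3),
                  metrics=['accuracy'])
    return model

Remember to apply np.squeeze to the input array, as shown above, before calling fit().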
