The goal is to transform each index into an embedding vector and then average all the vectors into a single one. I must ignore the padded zeros!
The averaged vector should then be passed on to the next layers.
This is my code:
from keras.layers import Embedding,Input,AveragePooling1D
from keras.models import Model
from keras.preprocessing.text import Tokenizer as Keras_Tokenizer
from keras.preprocessing.sequence import pad_sequences
import numpy as np
embedding_size = 4
vocab_size = 9 + 1
max_sequence_length = 5
my_input = Input(shape=(max_sequence_length,), name='input')
embedding_layer = Embedding(output_dim=embedding_size, input_dim=vocab_size, input_length=max_sequence_length, mask_zero=True, name='my_embedding')
embedding = embedding_layer(my_input)
avg = AveragePooling1D(pool_size=max_sequence_length)(embedding)  # calc average of all embedding vectors
model = Model(inputs=[my_input], outputs=avg)
model.get_weights()
aa = np.array([[0, 0, 0, 2, 4]])  # sanity check; length must equal max_sequence_length
model.predict(aa)[0][0]
And I am getting this error:
TypeError: Layer average_pooling1d_1 does not support masking, but was passed an input_mask: Tensor("my_embedding_9/NotEqual:0", shape=(?, 5), dtype=bool)
Can anyone assist?
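To clarify what I expect the model to compute, here is a plain-NumPy sketch of the masked average (the embedding matrix `W` and the sequence here are hypothetical stand-ins, not the actual layer weights):

```python
import numpy as np

embedding_size = 4
vocab_size = 10
max_sequence_length = 5

rng = np.random.default_rng(0)
W = rng.normal(size=(vocab_size, embedding_size))  # hypothetical embedding matrix

seq = np.array([0, 0, 0, 2, 4])  # padded sequence, 0 = padding index
mask = seq != 0                  # True for real tokens, False for padding

vectors = W[seq]                         # (5, 4): one embedding row per index
masked_avg = vectors[mask].mean(axis=0)  # average over non-padded rows only

# Equivalent weighted-sum form (what a custom averaging layer would compute):
m = mask.astype(float)[:, None]
weighted = (vectors * m).sum(axis=0) / m.sum()
assert np.allclose(masked_avg, weighted)
```

So for `seq = [0, 0, 0, 2, 4]` the result should be the mean of the embeddings for indices 2 and 4 only, not divided by the full sequence length of 5.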