Let's say my data has 25 features. In Keras, I could easily implement an Embedding
layer for each input feature and merge them together to feed to later layers.
I see that tf.nn.embedding_lookup
accepts an ids
parameter which can be a plain integer or an array of integers (e.g. [1, 2, 3, ...]). However, the feature input is often of the shape
x = tf.placeholder(tf.float32, [None, in_feature_num])
I could split the features into individual columns using
X = tf.split(1, in_feature_num, x)
and each feature input then has the shape [?, 1]. But embedding_lookup
does not accept a shape of [?, 1], and since we don't have a specified row length, I can't reshape
or unpack
it to a shape like [?].
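To make the setup concrete, here is a numpy analogue of the split step (the feature count and batch values here are made up for illustration):

```python
import numpy as np

in_feature_num = 4
# A batch of 3 rows, 4 features each
x = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 10, 11, 12]], dtype=np.float32)

# Analogue of tf.split(1, in_feature_num, x):
# one (batch, 1) column per feature
columns = np.split(x, in_feature_num, axis=1)
print(len(columns), columns[0].shape)  # 4 (3, 1)
```

Each element of `columns` keeps the trailing dimension of 1, which is exactly the [?, 1] shape that embedding_lookup rejects.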
So, how can I transform an input like
[[1],
[2],
[3],
...
]
into an embedding representation like this:
[
[....], #a vector
[....], #a vector
[....], #a vector
...
]
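What I want amounts to gathering rows from an embedding matrix by id. A numpy sketch of the desired behavior (the embedding matrix and ids here are hypothetical), which is what tf.nn.embedding_lookup does for an integer id tensor:

```python
import numpy as np

# Hypothetical embedding matrix: vocabulary of 10 ids, 4-dim vectors
embedding_matrix = np.arange(40, dtype=np.float32).reshape(10, 4)

# One feature column of ids with shape (n, 1), as produced by the split
ids = np.array([[1], [2], [3]])

# Flatten (n, 1) -> (n,), then gather one embedding row per id
vectors = embedding_matrix[ids.reshape(-1)]
print(vectors.shape)  # (3, 4)
```

The flattening step is the part I can't express in TensorFlow, because the batch dimension is unknown at graph-construction time.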
Related SO posts are: What does tf.nn.embedding_lookup function do? and TensorFlow Embedding Lookup, but those posts have not solved my problem.