I'm tracking down a possible data leak in my model. I use tf.layers.dense before a masking operation, and I'm concerned the model could simply learn to shuffle information between positions along the middle dimension of my input tensor.
Given an input tensor x = tf.ones((2, 3, 4)), does tf.layers.dense(x, 8) flatten x into a fully connected layer with 2*3*4 = 24 input neurons and 2*3*8 = 48 output neurons and then reshape the result back to [2, 3, 8]? Or does it effectively create 2*3 = 6 fully connected layers, each with 4 input and 8 output neurons, and concatenate their outputs?
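To make the second interpretation concrete, here is a small numpy sketch of what I believe the weight-sharing behavior would look like (a single kernel of shape (4, 8) applied over the last axis only, so positions along the middle dimension cannot mix). The names `kernel`, `bias`, and `dense_last_axis` are my own, not TensorFlow's:

```python
import numpy as np

# Hypothetical weights for a dense layer with 4 inputs and 8 units.
# Under the second interpretation there is ONE kernel of shape (4, 8),
# shared across all leading positions, not a (24, 48) matrix.
rng = np.random.default_rng(0)
kernel = rng.normal(size=(4, 8))
bias = np.zeros(8)

def dense_last_axis(x):
    # Matrix-multiply over the last axis only; leading dims are untouched.
    return x @ kernel + bias

x = np.ones((2, 3, 4))
y = dense_last_axis(x)          # shape (2, 3, 8)

# Perturb a single position along the middle dimension.
x2 = x.copy()
x2[0, 1, :] = 5.0
y2 = dense_last_axis(x2)

# Only the output at that same position changes -- no cross-position leak.
changed = np.any(y != y2, axis=-1)   # shape (2, 3) boolean map
```

If tf.layers.dense behaved like the first interpretation (flatten, then fully connect), perturbing one position would change all output positions, so a check like this on the actual TF op should distinguish the two cases.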