
I'm searching for a data leak in my model. I'm using tf.layers.dense before a masking operation and am concerned that the model could just learn to switch positions in the middle dimension of my input tensor.

When I have an input tensor x = tf.ones((2, 3, 4)), would tf.layers.dense(x, 8) flatten x into a fully connected layer with 2\*3\*4 = 24 input neurons and 2\*3\*8 = 48 output neurons, then reshape the result to [2, 3, 8]? Or would it create 2\*3 = 6 fully connected layers with 4 input and 8 output neurons each, then concatenate them?

    Since you have included "keras" tag: [No, in Keras the input of Dense layer is not flattened](https://stackoverflow.com/a/52092176/2099607). – today Oct 18 '18 at 11:53
  • And [neither in Tensorflow](https://www.tensorflow.org/api_docs/python/tf/layers/dense): "**Returns:** Output tensor the same shape as `inputs` except the last dimension is of size `units`." – today Oct 18 '18 at 11:56
  • "Returns: Output tensor the same shape as inputs except the last dimension is of size units." only specifies the output shape, not how it is calculated, doesn't it? Both ways described in my question would return this shape, if I'm not mistaken. – Philphis Oct 18 '18 at 15:35
  • Thank you for pointing out the wrong tag, thought it would use the same underlying functions - not so sure anymore. – Philphis Oct 18 '18 at 15:38

1 Answer


As for the Keras Dense layer, it has already been mentioned in another answer that its input is not flattened; instead, the layer is applied on the last axis of its input.

As for the TensorFlow `tf.layers.dense` layer, it actually inherits from the Keras Dense layer and, as a result, is likewise applied on the last axis of its input. In your example the kernel has shape (4, 8) and is shared across the first two dimensions, so positions in the middle dimension are never mixed.
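You can see the difference between the two interpretations in your question by counting parameters. A minimal NumPy sketch (emulating the Dense semantics, not the actual TF source; the variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.ones((2, 3, 4))          # like tf.ones((2, 3, 4))
W = rng.normal(size=(4, 8))     # one shared kernel of shape (input_dim, units)
b = np.zeros(8)

# What Dense computes: a matmul over the last axis only.
y = x @ W + b                   # shape (2, 3, 8)

# Equivalent view: the same (4, 8) kernel applied at each of the 2*3 positions.
y_per_position = np.stack([
    np.stack([x[i, j] @ W + b for j in range(3)])
    for i in range(2)
])

assert y.shape == (2, 3, 8)
assert np.allclose(y, y_per_position)
```

The layer therefore holds 4\*8 + 8 = 40 parameters, not the 24\*48 a flattening layer would need, and there is no path through which the middle-dimension positions could leak into each other.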

today