I'm new to TensorFlow 2 and reading the docs: https://www.tensorflow.org/api_docs/python/tf/Module
The part of that page relevant to my question is the MLP example (copy-pasted from there):
class MLP(tf.Module):
  def __init__(self, input_size, sizes, name=None):
    super(MLP, self).__init__(name=name)
    self.layers = []
    with self.name_scope:
      for size in sizes:
        self.layers.append(Dense(input_dim=input_size, output_size=size))
        input_size = size

  @tf.Module.with_name_scope
  def __call__(self, x):
    for layer in self.layers:
      x = layer(x)
    return x
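The MLP above depends on the Dense class defined earlier on the same docs page, which I didn't paste originally. For completeness, here it is (reproduced from memory of that page, so it may differ slightly from the current version):

```python
import tensorflow as tf

class Dense(tf.Module):
  def __init__(self, input_dim, output_size, name=None):
    super().__init__(name=name)
    # Weight matrix and bias, named 'w' and 'b' respectively
    self.w = tf.Variable(
        tf.random.normal([input_dim, output_size]), name='w')
    self.b = tf.Variable(tf.zeros([output_size]), name='b')

  def __call__(self, x):
    y = tf.matmul(x, self.w) + self.b
    return tf.nn.relu(y)
```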
and I don't understand why the output of the following:
>>> module = MLP(input_size=5, sizes=[5, 5])
>>> module.variables
(<tf.Variable 'mlp/b:0' shape=(5,) ...>,
<tf.Variable 'mlp/w:0' shape=(5, 5) ...>,
<tf.Variable 'mlp/b:0' shape=(5,) ...>,
<tf.Variable 'mlp/w:0' shape=(5, 5) ...>,
)
where I expected mlp/b:1 and mlp/w:1 to appear. I ran the same code on my machine and got the same names, i.e. both mlp/b:0 and mlp/w:0 appear twice. What have I missed? Does this result mean that the same w and b variables are reused?
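To check my reading of the :0 suffix, I also tried a minimal experiment outside of tf.Module (my own sketch, not from the docs page):

```python
import tensorflow as tf

# Two separate variables, both requesting the Python name 'b'.
v1 = tf.Variable(tf.zeros([5]), name='b')
v2 = tf.Variable(tf.zeros([5]), name='b')

# In TF2 eager mode, variable names are not uniquified,
# so both can report the same name string.
print(v1.name, v2.name)

# They are nonetheless distinct objects holding separate storage.
print(v1 is v2)
```

So duplicate name strings do not by themselves prove the variables are shared, which makes me even less sure what the :0 in the MLP output means.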