
According to this answer, the value of a tf.constant() is stored multiple times in memory.

This provides a practical answer to whether to use a TensorFlow constant or a non-trainable variable when you have a big tensor whose value should not change.

However, it is not clear to me why both exist, and why (and under which circumstances) a tf.constant would be replicated in memory.

erickrf

1 Answer


If you do W = tf.constant(embedding, name="W"), then the value of the embedding is stored twice: on the numpy side in embedding, and on the TensorFlow side in the W op. Note that constant values are stored in the Graph object, which is not optimized for large parallel data transfers (at least there were performance complaints before acac487a), whereas the Variable value store is optimized for them.
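One way to observe this duplication (a sketch, assuming TensorFlow 2.x, where the same distinction still holds: a tf.constant's value is baked into the serialized GraphDef, while a tf.Variable is captured by reference) is to compare the serialized graph sizes of two otherwise identical functions:

```python
import numpy as np
import tensorflow as tf

# A moderately large array standing in for an embedding matrix.
embedding = np.random.rand(1000, 64).astype(np.float32)  # ~256 KB

@tf.function
def use_constant(x):
    # The constant's value is embedded in the traced graph itself.
    W = tf.constant(embedding)
    return tf.matmul(x, W)

W_var = tf.Variable(embedding, trainable=False)

@tf.function
def use_variable(x):
    # The graph only holds a reference to the variable's storage.
    return tf.matmul(x, W_var)

x = tf.zeros((1, 1000))
g_const = use_constant.get_concrete_function(x).graph.as_graph_def()
g_var = use_variable.get_concrete_function(x).graph.as_graph_def()

print("constant graph bytes:", g_const.ByteSize())
print("variable graph bytes:", g_var.ByteSize())
```

The constant version's GraphDef is larger by roughly the size of the array, because the value lives both in the numpy array and inside the graph; the variable version's GraphDef stays small regardless of how big the embedding is.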

Yaroslav Bulatov