12

The TensorFlow documentation states that a Variable can be used any place a Tensor can be used, and they seem to be fairly interchangeable. For example, if v is a Variable, then x = 1.0 + v becomes a Tensor.
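For instance (a minimal sketch, assuming the TensorFlow 1.x graph-mode API):

```python
import tensorflow as tf

v = tf.Variable(1.0)     # a Variable holding mutable state
x = 1.0 + v              # the addition produces a Tensor, not a Variable

print(type(v).__name__)  # Variable (RefVariable in some 1.x releases)
print(type(x).__name__)  # Tensor
```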

What is the difference between the two, and when would I use one over the other?

knightian

1 Answer

16

It's true that a Variable can be used any place a Tensor can, but the key differences between the two are that a Variable maintains its state across multiple calls to run() and that its value can be updated by backpropagation during training (it can also be saved, restored, etc., as per the documentation).
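A minimal sketch of that statefulness, assuming the TensorFlow 1.x graph-mode API (this answer predates 2.x eager execution):

```python
import tensorflow as tf

v = tf.Variable(0.0, name="counter")
increment = tf.assign_add(v, 1.0)   # op that mutates the Variable in place

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(increment))      # 1.0
    print(sess.run(increment))      # 2.0 -- the state persisted between run() calls
```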

These differences mean that you should think of a variable as representing your model's trainable parameters (for example, the weights and biases of a neural network), while you can think of a Tensor as representing the data being fed into your model and the intermediate representations of that data as it passes through your model.
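For example, in a TF 1.x-style model sketch (the shapes and names here are purely illustrative):

```python
import tensorflow as tf

# Trainable parameters: Variables, persisted and updated by the optimizer
W = tf.Variable(tf.random_normal([784, 10]), name="weights")
b = tf.Variable(tf.zeros([10]), name="biases")

# Data and intermediate results: Tensors, recomputed on every run() call
x = tf.placeholder(tf.float32, [None, 784])   # data fed into the model
logits = tf.matmul(x, W) + b                  # intermediate representation
```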

  • When a Variable is used in a model along with an `Optimizer`, are the weights in the `Variable` always updated via backprop? Should the `Variable` objects I create be exactly the same as my set of trainable parameters, or are there parameters that I wouldn't want to be Variables, or vice versa? – knightian Jul 24 '16 at 21:25
  • 1
    Yes, as far as I understand it, every variable that has trainable=True set (the default) will be updated during backprop, provided of course that the gradient can propagate back to that particular variable (i.e., it occurs earlier in the flow of data to the loss operation). Depending on what you define as parameters, I imagine there would be some that you don't want updated during training, for example sequence lengths, maximum epochs, or the learning rate. These can be Python variables, which TensorFlow will automatically convert if needed, or they can be constant tensors. – Avishkar Bhoopchand Jul 24 '16 at 22:39
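A minimal sketch of the distinction described in the comment above (not from the original thread; assumes TF 1.x APIs and an illustrative loss):

```python
import tensorflow as tf

# Updated during backprop: trainable defaults to True
weights = tf.Variable(tf.random_normal([10, 10]), name="weights")

# Stored state that the optimizer should not touch
global_step = tf.Variable(0, trainable=False, name="global_step")

# Hyperparameters can stay as plain Python values (or constant tensors)
learning_rate = 0.01
loss = tf.reduce_sum(tf.square(weights))      # placeholder loss for illustration
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(
    loss, global_step=global_step)            # only `weights` receives gradients
```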