First of all, I am aware that a related question has been asked here.
However, this question is about the implementation and internals. I was reading the paper "A Tour of TensorFlow". The following two points are quoted from there:
1.
A tensor itself does not hold or store values in memory, but provides only an interface for retrieving the value referenced by the tensor.
This suggests to me that a Tensor is an object that simply stores a pointer to the result of an operation and, when the value of the tensor is retrieved, simply dereferences that pointer.
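To make my mental model concrete, here is a minimal Python sketch of it (this is purely an illustrative analogy, not TensorFlow's actual implementation; the `Op` and `Tensor` classes are hypothetical): the "tensor" holds no data of its own, only a reference to the operation that produces its value, and dereferences it on demand.

```python
class Op:
    """A node that computes and caches its output buffer when run."""
    def __init__(self, fn):
        self.fn = fn
        self.output = None  # filled in only when the op executes

    def run(self):
        self.output = self.fn()
        return self.output


class Tensor:
    """Holds no values itself; stores only a handle to the producing op."""
    def __init__(self, op):
        self._op = op

    def value(self):
        # "Dereference": fetch the buffer the op produced,
        # computing it first if it does not exist yet.
        if self._op.output is None:
            self._op.run()
        return self._op.output


add_op = Op(lambda: [1 + 2])  # pretend this is an elementwise add
t = Tensor(add_op)
print(t.value())  # -> [3]
```

Under this model, retrieving `t.value()` is just pointer-chasing into the op's output buffer, which is exactly the reading of point 1 I want to check.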
2.
Variables can be described as persistent, mutable handles to in-memory buffers storing tensors. As such, variables are characterized by a certain shape and a fixed type.
Here I get confused because, based on the previous point, I thought that tensors simply store a pointer. If they were simply pointers, they could be made mutable as well.
To be precise, these are my questions:
- What is the meaning of "in-memory buffers"?
- What is the meaning of a "handle"?
- Is my initial assumption about the internals of a tensor correct?
- What is the essential internal implementation difference between a tensor and a variable? Why are they declared differently and why is that difference essential to TensorFlow?
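To illustrate the distinction I am asking about, here is a hedged sketch (again an analogy in plain Python, not TensorFlow source; the `Variable` class and its methods are hypothetical): a variable as I understand point 2 would own a persistent, mutable in-memory buffer with a fixed shape and type, and the "handle" would remain valid while the buffer's contents change underneath it.

```python
import array


class Variable:
    """Owns a persistent in-memory buffer; assign() mutates it in place."""
    def __init__(self, initial, typecode="d"):
        self._buf = array.array(typecode, initial)  # the in-memory buffer
        self._shape = (len(initial),)               # fixed shape
        # typecode "d" fixes the type to double, mirroring the fixed dtype

    def assign(self, values):
        if len(values) != self._shape[0]:
            raise ValueError("shape is fixed")
        for i, v in enumerate(values):
            self._buf[i] = v  # in-place mutation; the handle is unchanged

    def read(self):
        return list(self._buf)


v = Variable([1.0, 2.0, 3.0])
handle = v                     # the "handle" survives mutations
v.assign([4.0, 5.0, 6.0])
print(handle.read())           # -> [4.0, 5.0, 6.0]
```

If this picture is right, then the difference from a plain tensor would be ownership and mutability of the buffer rather than the pointer mechanics themselves, but I would like that confirmed or corrected.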