24

In numpy I can create a copy of a variable with numpy.copy. Is there a similar method that I can use to create a copy of a Tensor in TensorFlow?

randomizer

4 Answers

37

You asked how to copy a variable in the title, but how to copy a tensor in the question. Let's look at the different possible answers.

(1) You want to create a tensor that has the same value that is currently stored in a variable that we'll call var.

tensor = tf.identity(var)

But remember, 'tensor' is a graph node that will have that value when evaluated; any time you evaluate it, it will grab the current value of var. You can play around with control flow ops such as with_dependencies() to control when the identity reads the variable relative to updates to it.
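A minimal sketch of that behavior, assuming the TF1-style API used in this answer (tf.control_dependencies stands in here for with_dependencies(), which later changed; see the comment below):

import tensorflow as tf

var = tf.Variable(1.0)
increment = var.assign_add(1.0)

# Force the increment to run before the identity is evaluated.
with tf.control_dependencies([increment]):
    snapshot = tf.identity(var)

init = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(init)
    print(sess.run(snapshot))  # 2.0 -- the increment ran first
    print(sess.run(snapshot))  # 3.0 -- re-evaluating grabs the new value of var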

(2) You want to create another variable and set its value to the value currently stored in a variable:

import tensorflow as tf
var = tf.Variable(0.9)
var2 = tf.Variable(0.0)
copy_first_variable = var2.assign(var)
init = tf.initialize_all_variables()
sess = tf.Session()

sess.run(init)

print(sess.run(var2))   # prints 0.0
sess.run(copy_first_variable)
print(sess.run(var2))   # prints 0.9

(3) You want to define a new variable and set its starting value to the value you already initialized another variable with (this is what nivwu.. above answered):

var2 = tf.Variable(var.initialized_value())

var2 will get initialized when you call tf.initialize_all_variables. You can't use this to copy var after you've already initialized the graph and started running things.
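A minimal sketch of this option, again assuming the TF1-style API from the examples above; note that var2 only picks up var's initial value, not later updates:

import tensorflow as tf

var = tf.Variable(0.9)
var2 = tf.Variable(var.initialized_value())  # copy of var's *initial* value

init = tf.initialize_all_variables()
with tf.Session() as sess:
    sess.run(init)             # var is initialized first, then var2 copies it
    print(sess.run(var2))      # 0.9
    sess.run(var.assign(0.5))
    print(sess.run(var2))      # still 0.9 -- later changes to var are not copied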

dga
  • `with_dependencies()` changed, see: http://stackoverflow.com/questions/37980078/tensorflow-has-no-attribute-with-dependencies – David Parks Mar 30 '17 at 02:40
  • @dga regarding your point (1), would it not be better to just create a constant tensor somehow along the lines of tensor = tf.constant(var) such that the copy won't be dependent on var? – Alex Feb 09 '18 at 22:14
10

You can do this in a couple of ways.

  • this will create a copy for you: v2 = tf.Variable(v1)
  • you can also use the identity op: v2 = tf.identity(v1) (which I think is the proper way of doing it).

Here is a code example:

import tensorflow as tf

v1 = tf.Variable([[1, 2], [3, 4]])
v_copy1 = tf.Variable(v1)
v_copy2 = tf.identity(v1)

init = tf.initialize_all_variables()
sess = tf.Session()
sess.run(init)
a, b = sess.run([v_copy1, v_copy2])
sess.close()

print(a)
print(b)

Both of them will print the same tensor.

Salvador Dali
  • It is not specified if we want a shallow or deep copy. I think it's safer to assume that it's about a deep copy; in that case: tf.Variable(source_variable.initialized_value()) – siemanko Nov 18 '15 at 22:53
  • @nivwusquorum can you show me the example when the identity will fail? – Salvador Dali Nov 18 '15 at 23:00
  • Deep Q-learning requires a deep copy of the Q network to use as the target network, for example. – siemanko Nov 19 '15 at 23:36
  • @niwu - there's no "shallow" or "deep" copy of a tensorflow variable. Variables hold tensors, and tensors don't have pointers. But please see my answer for why just doing v_copy1 = tf.Variable(v1) may not be correct either -- you're right that it may try to grab the value of v1 before v1 has been initialized, whereas using .initialized_value() will add a control dependency so that v1 gets initialized before the grab. But the terms "shallow" and "deep" copy have a specific technical meaning that doesn't apply here. – dga Nov 20 '15 at 04:45
  • This approach is not working for me for some reasons. I restore model and a variable which is a `tf.Variable()`. However, when I create another variable using `tf.identity()` and run it in a session, it prints out different values. – ARAT Nov 05 '18 at 16:54
6

This performs a deep copy:

copied_variable = tf.Variable(source_variable.initialized_value())

It also handles initialization properly, i.e.

tf.initialize_all_variables()

will properly initialize source_variable first and then copy that value to copied_variable.

siemanko
1

In TF2, tf.identity() will do the job for you. I recently encountered some problems using the function in Google Colab; in case that's why you're here, this will help.

Error : Failed copying input tensor from /job:localhost/replica:0/task:0/device:CPU:0 to /job:localhost/replica:0/task:0/device:GPU:0 in order to run Identity: No unary variant device copy function found for direction: 1 and Variant type_index: tensorflow::data::(anonymous namespace)::DatasetVariantWrapper [Op:Identity]

import tensorflow as tf

# Erroneous code
tensor1 = tf.data.Dataset.from_tensor_slices([[[1], [2]], [[3], [4]]])
tensor2 = tf.identity(tensor1)

# Correction
tensor1 = tf.data.Dataset.from_tensor_slices([[[1], [2]], [[3], [4]]])
with tf.device('CPU'):
    tensor2 = tf.identity(tensor1)
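For the more common case of copying a variable's value in TF2 eager mode, here is a minimal sketch (the variable v is just illustrative):

import tensorflow as tf

# In eager mode, tf.identity returns a tensor holding a copy of the
# variable's current value, so later assignments do not affect it.
v = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
v_copy = tf.identity(v)

v.assign([[9.0, 9.0], [9.0, 9.0]])
print(v_copy.numpy())  # still [[1. 2.] [3. 4.]]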
Himanshu Tanwar