
I create two tensors (namely x1 and y1) initialized from a uniform distribution, but when I print them out the results are not what I expected: the printed subtraction does not match the printed values of x1 and y1.

This is my code:

import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)
y1 = tf.random_uniform([1], 0, 10, tf.int32)

subtraction = x1 - y1

with tf.Session() as sess:
    print(sess.run(x1))
    print(sess.run(y1))
    print(sess.run(subtraction))

This is the result:

[6]
[2]
[0]

1 Answer


In your code, x1 and y1 are random number generators: they produce a new value each time they are evaluated. So when you run subtraction, which in turn evaluates your number generators x1 and y1 again, there is no reason to obtain a result that is consistent with the previous calls.
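
You can see the effect by evaluating a single generator in separate sess.run calls (a minimal sketch using the same TF 1.x API as your code; the printed values are only examples):

import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)

with tf.Session() as sess:
    # Each sess.run re-evaluates the random op, so a new value is drawn every time.
    print(sess.run(x1))  # e.g. [3]
    print(sess.run(x1))  # e.g. [7] -- a fresh, independent draw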

To achieve what you are looking for, store the values in a Variable:

import tensorflow as tf

x1 = tf.Variable(tf.random_uniform([1], 0, 10, tf.int32))
y1 = tf.Variable(tf.random_uniform([1], 0, 10, tf.int32))

subtraction = x1 - y1

with tf.Session() as sess:
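    # Initializing the variables draws the random values once and stores them;
    # every later read of x1 and y1 reuses those stored values.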
    sess.run(tf.global_variables_initializer())
    print(sess.run(x1))
    print(sess.run(y1))
    print(sess.run(subtraction))

Alternatively, if you don't need persistence between iterations and can evaluate all the operations that depend on your number generators at once, pack them into a single call to sess.run:

import tensorflow as tf

x1 = tf.random_uniform([1], 0, 10, tf.int32)
y1 = tf.random_uniform([1], 0, 10, tf.int32)

subtraction = x1 - y1

with tf.Session() as sess:
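    # A single run evaluates x1, y1 and subtraction together,
    # so the printed subtraction matches the printed operands.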
    print(sess.run([x1, y1, subtraction]))