
Calling tf.set_random_seed(SEED) has no effect that I can tell...

For example, running the code below several times inside an IPython notebook produces different output each time:

import tensorflow as tf
tf.set_random_seed(42)
sess = tf.InteractiveSession()
a = tf.constant([1, 2, 3, 4, 5])
tf.initialize_all_variables().run()
a_shuf = tf.random_shuffle(a)
print(a.eval())
print(a_shuf.eval())
sess.close()

If I set the seed explicitly: a_shuf = tf.random_shuffle(a, seed=42), the output is the same after each run. But why do I need to set the seed if I already call tf.set_random_seed(42)?


The equivalent code using numpy just works:

import numpy as np
np.random.seed(42)
a = [1,2,3,4,5]
np.random.shuffle(a)
print(a)
ostrokach
    Your code snippet only shows one evaluation of the random shuffle. How do you run it a second time to see a different result? (If I copy your snippet to a file `seed.py` and run `python seed.py` repeatedly, I do get the same results, and `tf.set_random_seed(42)` is working as intended.) – mrry Mar 19 '16 at 00:44
  • @mrry I have the code above inside a cell of an IPython notebook. Running the cell multiple times generates a different result, even though I would expect it to be the same... – ostrokach Mar 19 '16 at 18:24
    It's also strange that if I run `!python seed.py` multiple times from inside the IPython notebook I get the same result, but if I run `%run seed.py` multiple times, I get a different result... – ostrokach Mar 19 '16 at 18:39
    That's because you are appending ops to the existing graph each time you do `%run seed.py`. Try `print([n.name for n in tf.get_default_graph().as_graph_def().node])` to see what's in your graph. – Yaroslav Bulatov Mar 19 '16 at 21:25
  • @STJ This question was asked two years earlier and has more upvotes. Why is it the duplicate and not the other way around? – ostrokach Sep 30 '19 at 14:51
  • @ostrokach both the other question and the answer of the other question seemed more thought out and contain more information, so would be overall more helpful for people with the same issue. – STJ Oct 01 '19 at 15:45

1 Answer

That call only sets the graph-level random seed. If you execute this snippet several times in the same process, ops keep being added to the same graph, so each new shuffle op gets a different operation-level seed. The details are described in the docstring for tf.set_random_seed.
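The interaction can be mimicked with a short pure-Python sketch (hypothetical class and method names, not TensorFlow's actual implementation): the graph-level seed is paired with a per-op id, so re-running the cell creates new ops with new ids and therefore new effective seeds.

```python
# Hypothetical sketch of how a graph-level seed combines with a per-op id.
# Illustrative only; this is not TensorFlow's real code.
class FakeGraph:
    def __init__(self, graph_seed=None):
        self.graph_seed = graph_seed
        self.num_ops = 0

    def next_op_seed(self):
        # Every new op added to the graph gets the next id.
        self.num_ops += 1
        if self.graph_seed is None:
            return None
        return (self.graph_seed, self.num_ops)

g = FakeGraph(graph_seed=42)
print(g.next_op_seed())  # first run of the cell:      (42, 1)
print(g.next_op_seed())  # re-running adds another op: (42, 2)
```

Even though the graph seed stays 42, the second shuffle op sees a different (graph seed, op id) pair, which is why the output changes between runs of the cell.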

To get a deterministic a_shuf you can either:

  1. Call tf.reset_default_graph() between invocations, or
  2. Set an operation-level seed for the shuffle: a_shuf = tf.random_shuffle(a, seed=42)
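In the same hypothetical sketch style (illustrative names, not TensorFlow's API), both fixes amount to making the effective seed independent of how many ops were added to the graph before:

```python
# Hypothetical model, not TensorFlow's real implementation: the effective
# seed of a random op is represented as (graph_seed, op_id).
def make_graph(graph_seed=42):
    # Fix 1: tf.reset_default_graph() corresponds to starting from a
    # fresh graph whose op counter is back at zero.
    return {"seed": graph_seed, "num_ops": 0}

def shuffle_seed(graph, seed=None):
    graph["num_ops"] += 1
    if seed is not None:
        # Fix 2: an explicit op-level seed overrides the op counter,
        # so it is stable no matter how many ops came before it.
        return (graph["seed"], seed)
    return (graph["seed"], graph["num_ops"])

# Fix 1: resetting the graph yields the same seed on every "run".
assert shuffle_seed(make_graph()) == shuffle_seed(make_graph())

# Fix 2: an explicit seed is stable even inside one growing graph.
g = make_graph()
assert shuffle_seed(g, seed=42) == shuffle_seed(g, seed=42)
```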
Yaroslav Bulatov