queue = tf.FIFOQueue(batch_size * 2, dtypes=[tf.uint8, tf.float32], shapes=[[1,1080,1920,3], [1,11]], name='queue')

x_enqueue = tf.placeholder(dtype=tf.uint8, shape=[1,1080,1920,3], name="x_enqueue")
labels_enqueue = tf.placeholder(tf.float32, shape=[1, 11], name="labels_enqueue")  # todo define in loader thread

I've defined a FIFOQueue that I'm going to load from a custom function using multiple threads.

I've also defined placeholders to feed the data to the queue.

Should I define a different placeholder per thread? Or can all threads use the x_enqueue and labels_enqueue placeholders concurrently?
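For reference, the piece not shown above is the enqueue op itself, typically something like `enqueue_op = queue.enqueue([x_enqueue, labels_enqueue])`, which each loader thread runs with its own feed_dict. Since TensorFlow can't be assumed here, below is a minimal stdlib sketch of the same multi-threaded producer pattern, using Python's thread-safe `queue.Queue` as a stand-in for the `FIFOQueue` (all names and values hypothetical):

```python
import threading
import queue

# Stand-in for the TF FIFOQueue: a bounded, thread-safe queue.
# Capacity mirrors the batch_size * 2 used above.
batch_size = 4
fifo = queue.Queue(maxsize=batch_size * 2)

def loader(thread_id, n_items):
    """One loader thread; analogous to repeatedly running the
    enqueue op with a per-call feed_dict."""
    for i in range(n_items):
        # The values travel with each call; nothing is stored in a
        # shared per-thread slot, so one shared entry point is fine.
        fifo.put(("img-%d-%d" % (thread_id, i),
                  "label-%d-%d" % (thread_id, i)))

n_threads, per_thread = 3, 5
threads = [threading.Thread(target=loader, args=(t, per_thread))
           for t in range(n_threads)]
for t in threads:
    t.start()

# Consume while producers run, so the bounded queue never deadlocks.
items = [fifo.get() for _ in range(n_threads * per_thread)]
for t in threads:
    t.join()

print(len(items))  # 15 items, interleaved across the three threads
```

Note the consumer drains the queue while the producers are still running; with a bounded queue, joining the producers before consuming anything would deadlock once the queue fills.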

David Parks

1 Answer


As far as I've been able to learn from experimentation, the answer is no, you do not need a separate placeholder per thread. Each call to `session.run` effectively operates on a point-in-time snapshot of the current variable values, and the values fed or computed inside one run are not visible to other runs until that run has completed all of its processing. For more detail on how I came to this conclusion, see this related question: How are variables shared between concurrent `session.run(...)` calls in tensorflow?
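To see why shared placeholders don't collide, note that feed_dict values are arguments to each `session.run` call, not state stored in the graph. Here is a small stdlib sketch of that per-call isolation, with plain Python in place of TensorFlow and all names hypothetical:

```python
import threading

results = {}

def fake_run(feed_dict, key):
    # Mimics sess.run(enqueue_op, feed_dict=...): the fed values are
    # local to this call, so concurrent callers can share one
    # "placeholder" name without overwriting each other's inputs.
    results[key] = feed_dict["x_enqueue"] * 2

threads = [threading.Thread(target=fake_run,
                            args=({"x_enqueue": i}, i))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every call saw only its own feed; no cross-thread contamination.
print(results[3])  # 6
```

Each thread builds its own `feed_dict`, so even though all of them use the same key (the same placeholder), no thread ever reads another thread's value.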

David Parks