import tensorflow as tf

queue = tf.FIFOQueue(batch_size * 2, dtypes=[tf.uint8, tf.float32], shapes=[[1,1080,1920,3], [1,11]], name='queue')
x_enqueue = tf.placeholder(dtype=tf.uint8, shape=[1,1080,1920,3], name="x_enqueue")
labels_enqueue = tf.placeholder(tf.float32, shape=[1,11], name="labels_enqueue") # todo define in loader thread
I've defined a FIFOQueue that I plan to fill from a custom loader function running in multiple threads,
along with placeholders to feed the data into the queue.
Should I define a separate placeholder per thread, or can all threads use the x_enqueue and labels_enqueue placeholders concurrently?
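For context, the threaded enqueue path I have in mind looks roughly like this. It's a toy-sized sketch, not my actual pipeline: 4x4 frames instead of 1080p, a hypothetical loader that feeds zero-filled arrays, and the tf.compat.v1 API so it runs on current TensorFlow builds. All threads share the same two placeholders; each sess.run call carries its own feed_dict:

```python
import threading
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Toy shapes (4x4 frames instead of 1080p) to keep the sketch small.
queue = tf.FIFOQueue(8, dtypes=[tf.uint8, tf.float32],
                     shapes=[[1, 4, 4, 3], [1, 11]], name='queue')
x_enqueue = tf.placeholder(tf.uint8, shape=[1, 4, 4, 3], name="x_enqueue")
labels_enqueue = tf.placeholder(tf.float32, shape=[1, 11], name="labels_enqueue")
enqueue_op = queue.enqueue([x_enqueue, labels_enqueue])

def loader(sess, n_items):
    # Each sess.run call gets its own feed_dict, so the threads never
    # share mutable feed state even though the placeholder tensors are shared.
    for _ in range(n_items):
        frame = np.zeros([1, 4, 4, 3], dtype=np.uint8)  # stand-in for a real frame
        label = np.zeros([1, 11], dtype=np.float32)     # stand-in for a real label
        sess.run(enqueue_op, feed_dict={x_enqueue: frame,
                                        labels_enqueue: label})

with tf.Session() as sess:
    threads = [threading.Thread(target=loader, args=(sess, 2)) for _ in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    final_size = sess.run(queue.size())

print(final_size)  # 3 threads x 2 items each
```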