I am running TensorFlow 0.10.0rc0 on an Ubuntu 14.04 server with 48 cores and Python 2.7. I noticed an unusually high thread count. I thought I had found the explanation: TensorFlow spawns multiple threads for two thread pools, which can be controlled by:

sess = tf.Session(
    config=tf.ConfigProto(inter_op_parallelism_threads=NUM_CORES,
                          intra_op_parallelism_threads=NUM_CORES))

However, that does not work. After some investigation, I found that merely calling `import tensorflow as tf` already increases the thread count by around 50-60.

Why is this happening? How can I prevent it and limit the actual number of threads? What are those extra threads doing if I limit the number of inter_op_parallelism_threads and intra_op_parallelism_threads?
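For reference, this is how the spike can be measured. The sketch below assumes Linux (it reads `/proc/self/status`, whose `Threads:` field counts kernel-level threads, including ones spawned by TensorFlow's native runtime that `threading.active_count()` never sees); the `import tensorflow` line is left commented out so the snippet stands alone:

```python
def native_thread_count():
    """Return the kernel-level thread count of the current process.

    Reads /proc/self/status (Linux only), which includes native threads
    created by C++ libraries such as TensorFlow's runtime -- these do
    not appear in Python's threading.active_count().
    """
    with open('/proc/self/status') as f:
        for line in f:
            if line.startswith('Threads:'):
                return int(line.split()[1])
    raise RuntimeError('Threads: field not found')

before = native_thread_count()
# import tensorflow as tf  # on the affected setup this alone adds ~50-60 threads
after = native_thread_count()
print('threads before: %d, after: %d' % (before, after))
```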

Dominik Müller
  • `inter_op_parallelism_threads` controls how many simultaneous TensorFlow kernel calls can be launched, and `intra_op_parallelism_threads` controls the size of the Eigen threadpool. But TensorFlow treats threads as a cheap resource and spawns them for various things like callbacks. You can find more info by finding their names and tracking them down in code – Yaroslav Bulatov Aug 19 '16 at 20:39
  • Possible duplicate of the following question and answer. https://stackoverflow.com/questions/62141945/keras-tf-cpu-creating-too-many-threads/64489297#64489297 – fisakhan Oct 22 '20 at 19:34
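For later TensorFlow versions, the approach in the linked answer amounts to capping the native thread pools before any TensorFlow work starts. A hedged sketch: the environment variables below are the standard OpenMP/MKL ones and must be set before the import, and `tf.config.threading` exists only in TF 2.x, not the 0.10 release this question is about:

```python
import os

# Cap native thread pools used by OpenMP/MKL-backed ops. These must be
# set before TensorFlow (or an MKL-linked NumPy) is imported.
os.environ['OMP_NUM_THREADS'] = '1'
os.environ['MKL_NUM_THREADS'] = '1'

# In TF 2.x, the session-config knobs from the question become:
# import tensorflow as tf
# tf.config.threading.set_inter_op_parallelism_threads(1)
# tf.config.threading.set_intra_op_parallelism_threads(1)
```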

0 Answers