4

After importing TensorFlow in Python 2.7, I run the following command: `sess = tf.Session()`

Warnings/errors:

tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.

2017-02-02 00:41:48.616602: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.

2017-02-02 00:41:48.616614: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.

2017-02-02 00:41:48.616624: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.

Please help me fix this so I may use my machine at its optimal power.

Patrick
va4az
  • Hi, how did you go about installing tensorflow? Using the `pip` way or did you build it from source? – Giridhur Feb 02 '17 at 09:26
  • Must build from source using -march=native flag – Yaroslav Bulatov Feb 02 '17 at 15:47
  • @Giridhur I had built it from source using the directions given at the following website: [link](https://alliseesolutions.wordpress.com/2016/09/08/install-gpu-tensorflow-from-sources-w-ubuntu-16-04-and-cuda-8-0-rc/) – va4az Feb 27 '17 at 22:57
  • 1
    Possible duplicate of [How to compile Tensorflow with SSE4.2 and AVX instructions?](http://stackoverflow.com/questions/41293077/how-to-compile-tensorflow-with-sse4-2-and-avx-instructions) – Tai Christian Apr 05 '17 at 21:21

2 Answers

2

Those warnings are just telling you that if you build TensorFlow from source, it can run faster on your machine. There is no fix: this is not an issue but intended behavior, meant to provide this information to users.

These CPU instruction sets are not enabled in the default builds so that the prebuilt binaries stay compatible with as many machines as possible.

As the docs say:

TensorFlow checks on startup whether it has been compiled with the optimizations available on the CPU. If the optimizations are not included, TensorFlow will emit warnings, e.g. AVX, AVX2, and FMA instructions not included.

For all details on that see the Performance Guide.
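If you just want to silence these messages rather than rebuild, a common workaround is to raise TensorFlow's C++ log level via the `TF_CPP_MIN_LOG_LEVEL` environment variable before importing it. A minimal sketch (note this only hides the log lines; it does not enable the instruction sets):

```python
import os

# 0 = all logs, 1 = filter INFO, 2 = filter INFO + WARNING, 3 = errors only.
# This must be set BEFORE `import tensorflow` to take effect.
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"

# import tensorflow as tf  # now starts without the instruction-set warnings
```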

Adriano
  • @Adriano, I had built it from source. – va4az Feb 27 '17 at 22:55
  • @va4az When you were configuring TensorFlow you were prompted which flags would you like to enable, currently if you let it at default value, it enables the instructions your CPU supports and TensorFlow builds with them. Or you don't have those available or you didn't build with them. Use `gcc -march=native -Q --help=target | grep enable` to get the optimizations available. See [this question](http://stackoverflow.com/questions/41293077/how-to-compile-tensorflow-with-sse4-2-and-avx-instructions) for more info on compiling with those instructions – Adriano Feb 27 '17 at 23:34
0

These warnings are telling you that the prebuilt binary does not use instruction sets that your CPU supports but that not all CPUs out there have. When maintainers compile code for package repositories, it needs to run on the majority of CPUs, which means they tell the compiler not to use architecture-specific instructions.
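As a quick sanity check, you can list which of the warned-about instruction sets your CPU actually supports. A minimal sketch for Linux, reading `/proc/cpuinfo` (on other platforms it simply returns an empty list):

```python
def supported_flags(wanted=("sse4_2", "avx", "avx2", "fma")):
    """Return the subset of `wanted` instruction-set flags this CPU reports."""
    try:
        with open("/proc/cpuinfo") as f:
            text = f.read()
    except OSError:
        return []  # not Linux, or cpuinfo unavailable
    flags = set()
    for line in text.splitlines():
        if line.startswith("flags"):
            flags.update(line.split(":", 1)[1].split())
    return [w for w in wanted if w in flags]

print(supported_flags())  # e.g. ['sse4_2', 'avx', 'avx2', 'fma'] on a recent x86 CPU
```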

If you want the package to use all the instruction sets your CPU offers, you need to compile it yourself, i.e. install from source. You can find documentation about how to do that here, and once you're comfortable compiling TensorFlow from source, you should also read the performance-specific instructions.
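For reference, a source build with native optimizations roughly looks like this (a sketch only; exact targets and flags depend on the TensorFlow version, and it assumes Bazel and a TensorFlow source checkout are already set up):

```shell
# From the root of the TensorFlow source checkout:
./configure    # answer the prompts; defaults are fine for a CPU-only build

# -march=native lets the compiler use every instruction set this CPU supports
bazel build -c opt --copt=-march=native //tensorflow/tools/pip_package:build_pip_package

# Package the build as a wheel and install it
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip install /tmp/tensorflow_pkg/tensorflow-*.whl
```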

However, at the end of the day, for real-world applications you may really need a GPU. It's true that these CPU instruction sets give you a bit of a performance boost, but that's not comparable to using a GPU.

adrin
  • `However, at the end of the day, for realworld applications you might really need a GPU` That is not true. Often the opposite for inference. – Michael Ramos Aug 31 '20 at 01:53