
Will installing Numba for Python 2.7 interfere with a tensorflow-gpu installation that is already working on Ubuntu 16.04? I want to do GPU-accelerated computations on vectors and matrices (e.g. with Numba's vectorization) and then use the resulting matrices in TensorFlow deep learning models.

py study
  • TF tends to reserve all available GPU memory. There are methods to modify this. But if you don't do any modification and launch a TF session, you typically won't then be able to use another GPU activity like numba or pycuda, because TF has taken all the GPU memory. – Robert Crovella Mar 23 '18 at 14:41
  • Thanks for the comment. That's right! I am not gonna use both at the same time. I might wanna use Numba first to deal with the matrices before launching the TF session. How about the installation of Numba? Does it affect the CUDA installation for tensorflow? – py study Mar 23 '18 at 15:24

1 Answer


As @Robert Crovella correctly mentioned in the comments, TensorFlow will attempt to allocate all the GPU memory it can when you create a session. If necessary, you can prevent this:

How to prevent tensorflow from allocating the totality of a GPU memory?
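For reference, a minimal sketch of the two approaches from that link, using the TF 1.x API current at the time (the 0.5 fraction is just an example value, not a recommendation):

```python
import tensorflow as tf

config = tf.ConfigProto()
# Option 1: let TensorFlow grow its GPU memory usage on demand
# instead of grabbing everything up front.
config.gpu_options.allow_growth = True
# Option 2 (alternative): cap TensorFlow at a fixed fraction of GPU memory.
# config.gpu_options.per_process_gpu_memory_fraction = 0.5

sess = tf.Session(config=config)
```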

In general it's perfectly fine to run multiple processes on the same GPU; the primary issue is just memory allocation. If Numba allocates the memory it needs up front, you'll have no problem (disclaimer: I don't know Numba well). If Numba needs to allocate memory after your TensorFlow session has been created, then follow the link above to avoid the memory allocation issue; a rough sketch of that ordering is below.
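Here is an illustrative sketch of the "Numba first, then TensorFlow" ordering you described. The ufunc name, array shapes, and toy graph are made up for the example; the point is only that Numba finishes its CUDA work before the session claims GPU memory:

```python
import numpy as np
import tensorflow as tf
from numba import vectorize

# Hypothetical element-wise GPU ufunc compiled by Numba for the CUDA target.
@vectorize(['float32(float32, float32)'], target='cuda')
def add_gpu(x, y):
    return x + y

a = np.random.rand(1024, 1024).astype(np.float32)
b = np.random.rand(1024, 1024).astype(np.float32)

# Numba runs its CUDA kernel and returns the result to host memory
# before TensorFlow has allocated anything on the GPU.
c = add_gpu(a, b)

# Only now create the session, with allow_growth so any later GPU work
# outside TensorFlow still has memory left to use.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
with tf.Session(config=config) as sess:
    x = tf.placeholder(tf.float32, shape=(1024, 1024))
    total = sess.run(tf.reduce_sum(x), feed_dict={x: c})
```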

I am fairly certain that you won't encounter any kind of CUDA driver conflicts between the systems. At least I've never heard of this being an issue and I've been using tensorflow daily since v0.12.

David Parks