
I am running Windows 10 on an Intel Core i7-8700 CPU with 16 GB of RAM, a 1 TB HDD, and a dedicated NVIDIA GeForce GTX 1070 graphics card.

I plan to launch 3 Ubuntu instances hosted by my Windows 10 PC. The Ubuntu VMs will run Distributed TensorFlow (tensorflow-gpu) code that uses the GPU to train a neural network. (For what it's worth, I have already tried this setup directly on Windows but failed.)
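
For context, the three instances would run something roughly like the following. This is only a minimal TF 1.x sketch using `tf.train.ClusterSpec` / `tf.train.Server`, with placeholder hostnames and a toy model, not my actual training script:

```python
import tensorflow as tf  # tensorflow-gpu 1.x

# Placeholder hostnames/ports for the three Ubuntu instances
cluster = tf.train.ClusterSpec({
    "ps": ["ubuntu-vm1:2222"],
    "worker": ["ubuntu-vm2:2222", "ubuntu-vm3:2222"],
})

# Each instance would start this script with its own job_name/task_index,
# e.g. job_name="ps", task_index=0 on the first VM.
job_name, task_index = "worker", 0

server = tf.train.Server(cluster, job_name=job_name, task_index=task_index)

if job_name == "ps":
    server.join()  # the parameter server only hosts the shared variables
else:
    # Variables are placed on the ps task; ops run on this worker
    # (and on its GPU, if one is visible to the guest).
    with tf.device(tf.train.replica_device_setter(
            worker_device="/job:worker/task:%d" % task_index,
            cluster=cluster)):
        x = tf.placeholder(tf.float32, [None, 10])
        w = tf.Variable(tf.zeros([10, 1]))
        loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - 1.0))
        train_op = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

    with tf.Session(server.target) as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_op, feed_dict={x: [[1.0] * 10]})
```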

Q. Can my NVIDIA GPU be virtualized (shared) among those virtual machines or not?

  • If YES, are any further configurations required to make this happen?

  • If NOT, are there any suggestions for building such an experimental environment for Distributed TensorFlow?

Thanks.

Anjum
  • I think your rig has not enough power. It's a pretty cheap configuration :/ – Simo Sep 11 '18 at 09:47
  • Possible duplicate of [How to install tensorflow GPU version on VirtualBox Ubuntu OS. And host OS is windows 10](https://stackoverflow.com/questions/49345786/how-to-install-tensorflow-gpu-version-on-virtualbox-ubuntu-os-and-host-os-is-wi) – Jeremy Visser Sep 11 '18 at 09:53
  • Thanks @Simo, I understand it's a very basic setup and I am not expecting to run this in production. I would just like to know whether it is technically possible to virtualize this graphics card or not. – Anjum Sep 11 '18 at 09:56
  • @Simo Perhaps not for serious work but it's enough to get started – rath Sep 11 '18 at 09:56
  • @ShakeelAnjum I mean, it's a great config but not for what you are looking for. I don't know how it will work on 3 virtualized Ubuntu, but in my humble opinion it will be "meh" Anyway you could always give it a try :D – Simo Sep 11 '18 at 10:04
  • There is no supported way to use your GPU for TensorFlow in a virtualized OS with a Windows host (not supported by Nvidia, e.g. see [here](https://devtalk.nvidia.com/default/topic/1032226/cuda-programming-and-performance/can-i-run-cuda-on-virtual-machine/)). The closest thing you can have is a Linux host with GPU-enabled Docker instances using [nvidia-docker](https://github.com/NVIDIA/nvidia-docker) (not quite a virtual machine but probably close enough). – jdehesa Sep 11 '18 at 10:10
  • Thanks @jdehesa, I just found [this link](https://www.reddit.com/r/nvidia/comments/5m07h2/pci_passthrough_in_esxi_for_consumer_nvidia_cards/) in the NVIDIA post you referred to. Though not officially supported, it is apparently possible to virtualize these consumer GPUs with an ESXi passthrough setup, as several people there report. I am going to look into the nvidia-docker you mentioned, but I'm not giving up on VMs yet :) – Anjum Sep 11 '18 at 10:41
  • Thanks @JeremyVisser, I'm already using tensorflow-gpu as mentioned in the post you referred to, and I would value your input on whether the GPU passthrough [discussed here](https://www.reddit.com/r/nvidia/comments/5m07h2/pci_passthrough_in_esxi_for_consumer_nvidia_cards/) covers only PCI devices generally, or CUDA as well? – Anjum Sep 11 '18 at 10:45

1 Answer


I would go with @jdehesa's suggestion, as for now there seems to be no way to virtualize the GPU for TensorFlow on a Windows host. Thanks to @jdehesa.
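
If I end up on the Linux-host + nvidia-docker route that @jdehesa suggested, a quick sanity check inside each container (or VM, when experimenting with GPU passthrough) is to ask TensorFlow which devices it can actually see. This is a generic TF 1.x snippet, not tied to any particular setup:

```python
import tensorflow as tf
from tensorflow.python.client import device_lib

# A working setup should list a /device:GPU:0 entry alongside the CPU device.
for device in device_lib.list_local_devices():
    print(device.name, device.device_type)

# Reports whether this TensorFlow build can actually use a CUDA-enabled GPU.
print("GPU available:", tf.test.is_gpu_available())
```

If only a CPU device shows up, the guest is not seeing the GPU at all, which is what happens with a plain VirtualBox/VMware guest on a Windows host.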

Anjum