
This question is related to Will scikit-learn utilize GPU?, but I don't think it has the same answer. I'm executing scikit-learn algorithms on a machine with an Nvidia GPU without error, so I assume scikit-learn is running on the underlying hardware. Since scikit-learn is not designed to execute against a GPU, what is the process that enables the algorithms to run?

For example, I'm running scikit-learn algorithms on a Gigabyte Nvidia GTX 1060 WF2 3GB GDDR5 PCI-E card with these specs:

1152 NVIDIA CUDA Cores
1582MHz Base/1797MHz Boost Clock (OC Mode) or 1556MHz Base/1771MHz Boost Clock (Gaming Mode)
3GB GDDR5 8008MHz Memory

When using scikit-learn, are some of these CUDA cores simply not being used?

Update:

I use the Nvidia Docker container to run the container on the GPU, as specified here: https://github.com/NVIDIA/nvidia-docker. I've installed scikit-learn in this container, so are the scikit-learn algorithms being executed on the GPU?
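For reference, here is the kind of check I can run inside the container (a sketch; assumes only that scikit-learn is installed, and the toy dataset is purely illustrative): hide the GPU from any CUDA-aware library via CUDA_VISIBLE_DEVICES and see whether fitting still succeeds, which would suggest the GPU isn't actually involved:

```python
import os
# Hide the GPU from any CUDA-aware library before importing anything else
os.environ["CUDA_VISIBLE_DEVICES"] = ""

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy dataset, just to exercise a fit
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.score(X, y))  # fitting succeeds even with no GPU visible
```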

asked by blue-sky, edited by smci

2 Answers


scikit-learn does not and cannot run on the GPU. See this answer in the scikit-learn FAQ.
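If the goal is to use all available cores, the relevant knob in scikit-learn is CPU parallelism, not the GPU: many estimators accept an n_jobs parameter. A minimal sketch (the dataset and estimator choice here are just illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy dataset; n_jobs=-1 fans tree building out over all CPU cores.
# The GPU's CUDA cores are never touched.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=50, n_jobs=-1, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```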

olieidel

  • How do you measure that scikit-learn is running on your GPU? It is not. It's impossible. – olieidel Oct 01 '17 at 23:28
  • @blue-sky You should read an introduction to Docker if you think that running a Docker container built for GPU support means *anything that runs inside my container will run on the GPU*, which is wrong, by the way. – sascha Oct 02 '17 at 00:38

In my experience, you can use the Intel(R) Extension for Scikit-learn (sklearnex) package, linked below, to utilize the GPU for some sklearn algorithms.

The code I use:

from sklearnex import patch_sklearn
from daal4py.oneapi import sycl_context
patch_sklearn()  # swaps supported sklearn estimators for Intel-optimized versions

# then fit inside a SYCL GPU context to offload to the GPU, e.g.:
# with sycl_context("gpu"):
#     model.fit(X, y)

Source: oneAPI and GPU support in Intel(R) Extension for Scikit-learn

M.Vu