
I'm programming a DirectX game, and when I run it on an Optimus laptop the Intel GPU is used, resulting in horrible performance. If I force the NVIDIA GPU using the context menu or by renaming my executable to bf3.exe or some other famous game executable name, performance is as expected.
Obviously neither is an acceptable solution once I redistribute my game, so is there a way to programmatically force the laptop to use the NVIDIA GPU?

I've already tried using DirectX to enumerate adapters (IDirect3D9::GetAdapterCount, IDirect3D9::GetAdapterIdentifier) and it doesn't work: only 1 GPU is being reported (the one in use).

Smohn Jith

2 Answers


According to http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf, starting from the 302 drivers it is enough for the application to link against one of the following libraries: vcamp110.dll, vcamp110d.dll, nvapi.dll, nvapi64.dll, opencl.dll, nvcuda.dll, cudart*.*, or to export an NvOptimusEnablement variable from your program:

// Must be exported from the executable itself, not from a DLL it loads.
// DWORD comes from <windows.h>.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
DikobrAz
  • +1 for the enablement variable (and corresponding link). Note: it did not work with my 310 driver, but worked after updating to 320. – NobodysNightmare Aug 28 '13 at 09:49
    Should be noted that the Preferred graphics processor setting in the NVIDIA Control Panel has to be set to "Auto-select" for this to work. – Nathan Mar 29 '15 at 05:41
    This doesn't work anymore (GTX 960M - driver v385.41) – Matthias Aug 29 '17 at 07:25
  • Works on a Quadro M with the 445.71 driver, on a laptop with both Intel and NVIDIA GPUs: it selects NVIDIA when this variable is exported and Intel when it is not. – Prabindh Aug 27 '20 at 04:12

The Optimus whitepaper at http://www.nvidia.com/object/LO_optimus_whitepapers.html is unclear on exactly what it takes before a switch to the discrete GPU is made. The whitepaper says that DX, DXVA, and CUDA calls are detected and will cause the GPU to be turned on. But in addition, the decision is based on profiles maintained by NVIDIA, and of course one does not yet exist for your game.

One thing to try would be to make a CUDA call, for instance cuInit(0);. Unlike DX and DXVA, there is no way for the Intel integrated graphics to handle that, so it should force a switch to the discrete GPU.

Roger Dahl
  • Did anyone have actual success with this method? I just tried calling `cuInit` before creating my OpenGL context, and I get an Intel context, not an NVidia one. – rotoglup Aug 02 '12 at 11:42
  • @rotoglup: Please let us know which OS you're running and, if you find a solution, add it as an answer to this question. – Roger Dahl Aug 03 '12 at 16:31
  • I'm running Win7 Home x64 SP1 + NVidia 296.16 drivers. I'm quite clueless about how to proceed: the [NVIDIA CUDA Developer Guide for NVIDIA Optimus Platforms](http://developer.download.nvidia.com/compute/cuda/docs/CUDA_Developer_Guide_for_Optimus_Platforms.pdf) doc mentions that binding an OpenGL context to CUDA doesn't work on Optimus platforms; it says that this requires an 'application profile' in the NVidia Control Panel... Maybe NVAPI allows creating such a profile programmatically, but the NVidia developer website is down ATM. – rotoglup Aug 04 '12 at 13:34