
On my MS Surface 3 laptop, my app uses SDL2 to create an OpenGL context. Context creation always picks the integrated Intel GPU instead of the NVIDIA one, and I have no idea how to choose the discrete GPU. I searched the whole day, but only found some information about choosing it via Optimus. I don't know what that is, and I don't have any Optimus settings on my laptop. In any case, I really need to choose the GPU programmatically, not through the Control Panel. Is there really no way to select a GPU using the OpenGL API?

I have another app that uses DirectX 11, and there I can easily choose the GPU by simple adapter enumeration. So the NVIDIA GPU is definitely available on my laptop; I just don't know how to select it with OpenGL.

genpfault
Andrey Honich

1 Answer

OpenGL, or rather its Win32 GDI integration, offers no means to explicitly select the desired device. However, the NVIDIA and AMD drivers provide a workaround that lets a program declare that it prefers to run on the discrete GPU rather than the CPU-integrated one.

Add this to some translation unit of your program, for example the source file that contains the main function:

#include <windows.h> /* for the DWORD type */

#ifdef __cplusplus
extern "C" {
#endif

/* Tells the NVIDIA Optimus driver to prefer the discrete GPU. */
__declspec(dllexport) DWORD NvOptimusEnablement = 1;
/* The same request for AMD PowerXpress (switchable graphics). */
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;

#ifdef __cplusplus
}
#endif

When the NVIDIA or AMD driver sees its respective symbol exported from a program and set to a nonzero value, it gives the discrete GPU precedence over the integrated one when creating the OpenGL context.

datenwolf
  • `DWORD` is defined in windows headers. If you want to avoid including them, you can use `uint32_t` instead. – tuket Feb 15 '23 at 09:29
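Following that suggestion, a minimal sketch of the header-free variant (the driver only cares about the exported name and a 32-bit value; the `_WIN32` guard is just so the snippet also compiles on non-Windows toolchains):

```c
#include <stdint.h>

#ifdef __cplusplus
extern "C" {
#endif

/* uint32_t has the same size as DWORD, so windows.h is not needed. */
#if defined(_WIN32)
__declspec(dllexport)
#endif
uint32_t NvOptimusEnablement = 1;

#ifdef __cplusplus
}
#endif
```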