
I am part of a team developing an application using C++ with SDL and OpenGL.

On laptops, when the application is run, the dedicated graphics card is not used and GL context creation fails because the integrated graphics card does not support the version of OpenGL we want.

I have a feeling that this problem is specific to the laptop in question and not something we can solve through code, but if anyone knows of a solution, that would be great.

Connor Hollis
  • Do you mean laptops with dual graphics cards? Does manually switching to the dedicated card help (in nvidia settings or wherever it is)? – riv May 29 '13 at 20:43
  • You might be able to use the target platform(s) specific API(s) to access what devices are available then pick which one to create the active context on. Though I have a feeling you are right and the inactive graphics device will not show up until turned on in the settings for the laptop as suggested by @riv. – kc7zax May 29 '13 at 20:46
  • @riv Yes, it is a laptop with dual graphics cards. We can of course add the application to the list of applications that use the dedicated card in the nvidia/ati settings, but for end users we would prefer they don't have to do that. – Connor Hollis May 29 '13 at 21:05
  • The replies with `__declspec(dllexport)` are old and specific to the Nvidia Optimus driver. Windows 10 now has its own way to configure the high-performance GPU (see https://pureinfotech.com/set-gpu-app-windows-10/). Are the replies still up to date, or is there a vendor-neutral way to achieve this in Windows 10 in the meantime? – jcm Oct 28 '21 at 08:25

2 Answers


The easiest way from C++ to ensure that the dedicated graphics card is used instead of chipset switchable graphics under Windows is to export the following symbols (MSVC sample code):

Enable dedicated graphics for NVIDIA:

extern "C" 
{
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}

Enable dedicated graphics for AMD Radeon:

extern "C"
{
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

Caveat: If the user has created a profile for the application to use the integrated chipset, then these will not work.

I am unsure if this would work similarly under Linux / MacOS (unlikely).
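
If it helps for testing, here is a minimal sketch (assuming SDL2; the window title, size, and the 3.2 core profile are just placeholders for whatever your application actually requests) that exports both symbols and then logs GL_VENDOR / GL_RENDERER after context creation, so you can confirm which adapter the driver actually picked:

#include <SDL.h>
#include <SDL_opengl.h>
#include <cstdio>

extern "C"
{
  // Markers read by the NVIDIA Optimus and AMD PowerXpress drivers.
  __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
  __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

int main(int argc, char *argv[])
{
  SDL_Init(SDL_INIT_VIDEO);

  // Placeholder version - request whatever your application actually needs.
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

  SDL_Window *window = SDL_CreateWindow("GPU check",
      SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
      640, 480, SDL_WINDOW_OPENGL);
  SDL_GLContext context = SDL_GL_CreateContext(window);
  if (!context)
  {
    std::printf("Context creation failed: %s\n", SDL_GetError());
    return 1;
  }

  // These strings reveal which adapter the driver actually chose.
  std::printf("Vendor:   %s\n", reinterpret_cast<const char *>(glGetString(GL_VENDOR)));
  std::printf("Renderer: %s\n", reinterpret_cast<const char *>(glGetString(GL_RENDERER)));

  SDL_GL_DeleteContext(context);
  SDL_DestroyWindow(window);
  SDL_Quit();
  return 0;
}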

Christopher Oezbek
  • You have `__declspec` for one variable and `_declspec` for another. Typo or am I missing something? – HolyBlackCat Aug 20 '16 at 19:03
  • Thanks for noticing that! Two underscores is more correct, but _declspec is working as expected: http://stackoverflow.com/questions/1399215/difference-between-declspec-and-declspec – Christopher Oezbek Aug 20 '16 at 21:23
  • Hey, how could I use that in a C# project? (WPF) – user2088807 Feb 16 '17 at 10:09
  • @user2088807 Try using DllExport in C#. Anyway, why would a WPF app need discrete graphics? Also, I think this works by the AMD/Nvidia driver checking whether the app exports this symbol, and if so it turns on the discrete GPU. – Suici Doga Apr 19 '17 at 15:42
  • Awesome, thanks. Didn't expect it to work, but it does, at least with Nvidia! Also, didn't know you can `dllexport` stuff from an .exe. – Violet Giraffe May 08 '17 at 20:07
  • Any info how to do it in mac os? – Jeka Mar 06 '19 at 20:07
  • For .net (c#,vb) see https://stackoverflow.com/questions/17270429/forcing-hardware-accelerated-rendering – Dariusz Wasacz May 13 '20 at 14:06
  • Should these symbols be exported from the main executable (.exe), or can they be in one of its DLLs? – Roman Khvostikov Feb 01 '21 at 15:03
  • @RomanKhvostikov It has to be from the main executable, or from a library that is statically linked to it. DLLs do not work. – pvallet May 05 '21 at 17:16

Does it use NVidia dedicated graphics? AFAIK, the process of automatically switching from integrated to dedicated is based on application profiles. Your application is not in the driver's list of known 3D applications, and therefore the user has to manually switch to the dedicated GPU.

Try changing the executable name of your application to something the driver looks for. For example "Doom3.exe". If that works, then you've found your problem.

If that didn't help, try following the instructions on how to make the driver insert your application in its list of 3D apps:

https://nvidia.custhelp.com/app/answers/detail/a_id/2615/~/how-do-i-customize-optimus-profiles-and-settings

But the above is only for verifying that this is indeed your problem. For an actual solution, you should check with the graphics driver vendors (AMD and NVidia) on the best way to insert a profile for your application into their lists. NVidia provides NVAPI and AMD has ADL and AGS. They're definitely worth studying.
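
On the NVidia side this boils down to the driver-settings (DRS) part of NVAPI. Here is a rough sketch of creating such a profile (for example from your installer), assuming the NVAPI SDK headers nvapi.h and NvApiDriverSettings.h are available. The profile and executable names are placeholders, and the SHIM_RENDERING_MODE setting comes from NVidia's Optimus rendering-policies documentation, so verify it against the current SDK:

#include <nvapi.h>
#include <NvApiDriverSettings.h>

// Helper: copy a wide string into an NvAPI_UnicodeString buffer.
static void copyUnicode(NvAPI_UnicodeString dst, const wchar_t *src)
{
  size_t i = 0;
  for (; src[i] != L'\0' && i + 1 < NVAPI_UNICODE_STRING_MAX; ++i)
    dst[i] = static_cast<NvU16>(src[i]);
  dst[i] = 0;
}

// Creates a driver profile that requests the discrete GPU for the given exe.
// Error handling is omitted for brevity; every call returns an NvAPI_Status.
bool addDiscreteGpuProfile()
{
  if (NvAPI_Initialize() != NVAPI_OK)
    return false;                                      // no NVIDIA driver installed

  NvDRSSessionHandle session = 0;
  NvAPI_DRS_CreateSession(&session);
  NvAPI_DRS_LoadSettings(session);

  NVDRS_PROFILE profile = {};
  profile.version = NVDRS_PROFILE_VER;
  copyUnicode(profile.profileName, L"MyApplication");  // placeholder profile name

  NvDRSProfileHandle hProfile = 0;
  NvAPI_DRS_CreateProfile(session, &profile, &hProfile);

  NVDRS_APPLICATION app = {};
  app.version = NVDRS_APPLICATION_VER;
  copyUnicode(app.appName, L"MyApplication.exe");      // placeholder executable name
  NvAPI_DRS_CreateApplication(session, hProfile, &app);

  // Ask Optimus to render this profile on the discrete GPU.
  NVDRS_SETTING setting = {};
  setting.version = NVDRS_SETTING_VER;
  setting.settingId = SHIM_RENDERING_MODE_ID;
  setting.settingType = NVDRS_DWORD_TYPE;
  setting.u32CurrentValue = SHIM_RENDERING_MODE_ENABLE;
  NvAPI_DRS_SetSetting(session, hProfile, &setting);

  NvAPI_DRS_SaveSettings(session);
  NvAPI_DRS_DestroySession(session);
  NvAPI_Unload();
  return true;
}

AMD's ADL/AGS would need its own code path; I haven't sketched that here.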

Nikos C.
  • The goal is to prevent the end user from having to add the application to the list of applications that use the dedicated graphics card. This could also occur on non-Nvidia devices as well. I'll take a look at the link you sent. Is there a similar solution for ATI cards? – Connor Hollis May 29 '13 at 21:09
  • @ConnorHollis: The places where the application profiles are stored are well known. The straightforward solution is to have the installer add an application profile for the drivers of AMD and NVidia. – datenwolf May 29 '13 at 21:49
  • But you should use NVAPI from Nvidia for creating the application profile instead of writing to those places yourself, as the location of that information has already changed in the past and might change again at any moment. – Daniel Flassig May 30 '13 at 10:58
  • @ConnorHollis These APIs are definitely worth a look. I've put links to them in the answer. – Nikos C. May 30 '13 at 11:57