We're developing 64-bit C#/.NET software that uses DirectX for 3D rendering on Windows 7 and later.
We've observed that a number of the newer Dell laptops we're testing on have dual video cards: Intel HD 4600 integrated graphics plus a faster discrete card, for example an NVIDIA Quadro.
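Detecting that a machine is one of these dual-card setups looks like the easy part. Something along the lines of this WMI query (a rough sketch using System.Management; the vendor-string matching is our own assumption) should tell us whether both an Intel and an NVIDIA controller are present:

```csharp
// Rough sketch: enumerate the video controllers Windows reports so we can
// tell whether we're on one of these dual-card (Intel + NVIDIA) machines.
// Requires a reference to System.Management.
using System;
using System.Linq;
using System.Management;

static class GpuInfo
{
    public static bool HasIntelAndNvidia()
    {
        using (var searcher = new ManagementObjectSearcher(
            "SELECT Name, AdapterCompatibility FROM Win32_VideoController"))
        {
            var vendors = searcher.Get()
                .Cast<ManagementObject>()
                .Select(mo => (mo["AdapterCompatibility"] as string) ?? "")
                .ToList();

            // Assumption: the vendor strings contain "Intel" / "NVIDIA" on these laptops.
            return vendors.Any(v => v.IndexOf("Intel", StringComparison.OrdinalIgnoreCase) >= 0)
                && vendors.Any(v => v.IndexOf("NVIDIA", StringComparison.OrdinalIgnoreCase) >= 0);
        }
    }
}
```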
Out of the box, the Intel graphics are used by the DirectX application, presumably to preserve battery life. But the performance is noticeably worse than with the NVIDIA card.
Using the NVIDIA Control Panel, the user can choose which card is used by default. As soon as the user switches it to the NVIDIA card, performance improves dramatically.
So, my question is: is there any way, in code, to detect this setting and/or modify it for our application (on install and/or on launch)? Can we detect that the Intel card is being used for our app and, if it's one of these dual-card scenarios, prompt the user and perhaps (if they request it) change the setting for them?
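To make the "detect" half concrete: we think we can see which adapter our Direct3D device actually ended up on with something like the sketch below (SharpDX is assumed here purely for illustration; substitute whatever wrapper you use). What we haven't found is any documented way to read or change the NVIDIA Control Panel preference itself.

```csharp
// Rough sketch (SharpDX assumed as the DirectX wrapper): after creating the
// Direct3D 11 device, ask DXGI which adapter it was actually created on.
// VendorId 0x8086 is Intel, 0x10DE is NVIDIA.
using SharpDX.Direct3D11;

static class ActiveAdapterCheck
{
    public static bool DeviceIsOnIntel(Device device)
    {
        using (var dxgiDevice = device.QueryInterface<SharpDX.DXGI.Device>())
        using (var adapter = dxgiDevice.Adapter)
        {
            var desc = adapter.Description;
            return desc.VendorId == 0x8086; // Intel; 0x10DE would be the NVIDIA card
        }
    }
}
```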
As it stands, we have to walk users through making the change manually in the NVIDIA Control Panel.
Has anyone else dealt with this, and do you have any advice on how to proceed?