NVIDIA Optimus is a GPU-switching technology for laptops that pair an integrated GPU with a discrete NVIDIA graphics card, switching between the two to trade performance against battery life. On Linux it is commonly configured through Bumblebee (bumblebee.service) or PRIME rather than working out of the box. Some questions under this tag concern LG Optimus phones rather than the graphics technology.
Questions tagged [optimus]
48 questions
27
votes
2 answers
Forcing NVIDIA GPU programmatically in Optimus laptops
I'm programming a DirectX game, and when I run it on an Optimus laptop the Intel GPU is used, resulting in horrible performance. If I force the NVIDIA GPU using the context menu or by renaming my executable to bf3.exe or some other famous game…

Smohn Jith
- 305
- 1
- 3
- 8
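For "Forcing NVIDIA GPU programmatically in Optimus laptops": NVIDIA's Optimus rendering guidelines document a driver-recognized export that requests the discrete GPU without renaming the executable. A minimal C++ sketch; note the export must live in the .exe itself, not in a DLL:

```cpp
#include <windows.h>

// Exporting this symbol with a non-zero value tells the NVIDIA Optimus
// driver (302+) to prefer the discrete GPU for this process.
// The analogous AMD switch is AmdPowerXpressRequestHighPerformance.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
```

Renaming the executable to bf3.exe works for the same underlying reason: the driver matches process names against its built-in application profiles.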
18
votes
2 answers
Resigning system.img on a device
I am working on an automatic app updating solution for devices (LG p509 - Optimus 1) which we deploy to our customers. We have control of these devices and currently install a custom kernel on them (but not a full custom ROM). Since we are trying to…

natez0r
- 1,064
- 9
- 14
15
votes
6 answers
Forcing hardware accelerated rendering
I have an OpenGL library written in C++ that is used from a C# application through C++/CLI adapters. My problem is that when the application runs on laptops with NVIDIA Optimus technology, it does not use hardware acceleration and…

JohanR
- 151
- 1
- 4
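Because the NvOptimusEnablement export shown for the first question only takes effect when placed in the executable, it does not help here, where the .exe is a C# host. A workaround often reported for this situation is loading NVIDIA's NVAPI DLL early, which some Optimus drivers take as a hint to route the process to the discrete GPU; this is driver-dependent behavior, so treat the sketch below as a heuristic, not a guarantee:

```cpp
#include <windows.h>

// Call from the C++/CLI adapter before the first OpenGL context is created.
// Loading nvapi(64).dll is reported to make some Optimus drivers assign the
// process to the NVIDIA GPU; harmless if the DLL is absent.
void hintDiscreteGpu() {
#ifdef _WIN64
    LoadLibraryA("nvapi64.dll");
#else
    LoadLibraryA("nvapi.dll");
#endif
}
```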
6
votes
1 answer
Optimus headless browser with C#
Can somebody tell me how to use the Optimus (headless browser) NuGet package with C# to get the response from a URL? I also want JavaScript on the page to be executed automatically, like PhantomJS does.

Puneet Pant
- 918
- 12
- 37
6
votes
1 answer
Per-monitor DPI-Aware: black window glitch with NVIDIA Optimus
I would like to make a Per-Monitor DPI-Aware Direct2D application. I have extended Microsoft's "First Direct2D Program" example to handle WM_DPICHANGED as explained in Kenny Kerr's MSDN article. This works when both monitors use one video card, but…

Lack
- 1,625
- 1
- 17
- 29
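For context, the WM_DPICHANGED handling from Kenny Kerr's article that the question builds on amounts to resizing the window to the rectangle the message suggests and updating the render target's DPI. A minimal sketch of that handler (it does not by itself cure the Optimus black-window glitch):

```cpp
#include <windows.h>

// Called from the window procedure on WM_DPICHANGED: move/resize the window
// to the system-suggested rectangle so it tracks the new monitor's DPI.
LRESULT HandleDpiChanged(HWND hwnd, WPARAM wParam, LPARAM lParam)
{
    const RECT* suggested = reinterpret_cast<const RECT*>(lParam);
    SetWindowPos(hwnd, nullptr,
                 suggested->left, suggested->top,
                 suggested->right - suggested->left,
                 suggested->bottom - suggested->top,
                 SWP_NOZORDER | SWP_NOACTIVATE);
    // HIWORD(wParam) carries the new DPI; a Direct2D app should also update
    // its render target (ID2D1RenderTarget::SetDpi) before repainting.
    return 0;
}
```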
4
votes
1 answer
How can I check which version of OpenGL is supported on a Linux system with optirun?
I have had a lot of problems and confusion setting up my laptop for OpenGL programming and for running OpenGL programs.
My laptop has one of these very clever (too clever for me) designs where the Intel CPU has a graphics processor on chip,…

FreelanceConsultant
- 13,167
- 27
- 115
- 225
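A small probe program makes the optirun comparison concrete: build it once, run it plainly and again under optirun, and compare the reported strings. A sketch assuming GLFW is installed (build with g++ probe.cpp -lglfw -lGL):

```cpp
#include <cstdio>
#include <GLFW/glfw3.h>  // pulls in the OpenGL header by default

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(64, 64, "gl-probe", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // GL_RENDERER names the GPU that answered; GL_VERSION is what it supports.
    std::printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
    std::printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}
```

Running ./gl-probe should report the Intel GPU and optirun ./gl-probe the NVIDIA one; optirun glxinfo | grep "OpenGL version" gives the same answer without compiling anything.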
3
votes
0 answers
GPU benchmark for nvidia Optimus cards
I need a GPGPU benchmark which will load the GPU so that I can measure parameters like temperature rise, amount of battery drain, etc. Basically I want to alert the user when the GPU is using much more power than in normal use. Hence I need to decide…

Anup Warnulkar
- 773
- 1
- 8
- 25
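Whatever benchmark supplies the load, the temperature and power numbers the question wants to watch can be read through NVML, which ships with the NVIDIA driver. A minimal sketch (link with -lnvidia-ml; power readout is not supported on every mobile GPU, so real code should check each return value):

```cpp
#include <cstdio>
#include <nvml.h>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t device;
    if (nvmlDeviceGetHandleByIndex(0, &device) == NVML_SUCCESS) {
        unsigned int tempC = 0, powerMilliwatts = 0;
        nvmlDeviceGetTemperature(device, NVML_TEMPERATURE_GPU, &tempC);
        nvmlDeviceGetPowerUsage(device, &powerMilliwatts);
        std::printf("GPU: %u C, %.1f W\n", tempC, powerMilliwatts / 1000.0);
    }

    nvmlShutdown();
    return 0;
}
```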
3
votes
1 answer
TTS text received & processed but NOT HEARD on LG Optimus S
On one hand, this problem is tough because I have the exact same code working perfectly on 3 different Android 2.2 phones, but not on an LG Optimus S (running Android 2.2, too).
On the other hand, this problem is reproducible, so there may be…

an00b
- 11,338
- 13
- 64
- 101
3
votes
0 answers
How do I tell the Java Invocation API which Java VM to use?
I'm trying to create a launcher for a game I wrote in Java, using the Java Invocation API in C++. I'd like it to use the JRE folder (from OpenJDK) that I bundled with the game, so that people who don't have Java on their computers can still play the…

hujasonx
- 106
- 3
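The Invocation API has no "pick a VM" parameter: the VM you get is decided by which jvm library the process loads. For a bundled JRE that means loading its jvm.dll (or libjvm.so) explicitly instead of linking against whatever is on the system. A Windows sketch; the relative path and game.jar are assumptions about the bundle layout:

```cpp
#include <windows.h>
#include <jni.h>
#include <cstdio>

typedef jint (JNICALL *CreateJavaVMFn)(JavaVM**, void**, void*);

int main() {
    // Load the JVM from the JRE shipped next to the launcher, not from PATH.
    HMODULE jvmLib = LoadLibraryA("jre\\bin\\server\\jvm.dll");  // assumed layout
    if (!jvmLib) { std::fprintf(stderr, "bundled jvm.dll not found\n"); return 1; }

    auto createJavaVM =
        reinterpret_cast<CreateJavaVMFn>(GetProcAddress(jvmLib, "JNI_CreateJavaVM"));
    if (!createJavaVM) return 1;

    JavaVMOption options[1];
    options[0].optionString = const_cast<char*>("-Djava.class.path=game.jar");

    JavaVMInitArgs args{};
    args.version = JNI_VERSION_1_8;
    args.nOptions = 1;
    args.options = options;

    JavaVM* vm = nullptr;
    JNIEnv* env = nullptr;
    if (createJavaVM(&vm, reinterpret_cast<void**>(&env), &args) != JNI_OK)
        return 1;
    // ... FindClass / CallStaticVoidMethod to enter the game's main() ...
    vm->DestroyJavaVM();
    return 0;
}
```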
3
votes
0 answers
Enable/disable NVIDIA Optimus programmatically using C++ under Ubuntu amd64
EDIT: This is, in my opinion, not an exact duplicate, because this question asks for a solution specific to Ubuntu, while the other asks for a cross-platform solution.
In order to save power it is common in recent graphics architectures to dynamically switch…

Mr. Developerdude
- 9,118
- 10
- 57
- 95
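With the Bumblebee stack the usual mechanism is bbswitch: the kernel module exposes /proc/acpi/bbswitch, and writing ON or OFF powers the discrete GPU up or down. A minimal sketch (needs root, the bbswitch module loaded, and the GPU idle before powering off):

```cpp
#include <fstream>
#include <iostream>
#include <string>

// Power the discrete GPU on or off through the bbswitch control file.
bool setDiscreteGpu(bool on) {
    std::ofstream ctl("/proc/acpi/bbswitch");
    if (!ctl) return false;  // module not loaded, or not running as root
    ctl << (on ? "ON" : "OFF") << '\n';
    return static_cast<bool>(ctl.flush());
}

int main(int argc, char** argv) {
    const bool on = (argc > 1 && std::string(argv[1]) == "on");
    std::cout << (setDiscreteGpu(on) ? "ok" : "failed") << '\n';
    return 0;
}
```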
3
votes
1 answer
Android Studio / AVD with Bumblebee / NVIDIA Optimus
When using a computer with Linux and an NVIDIA Optimus graphics card for Android development, is it possible to tell IntelliJ / Android Studio to launch the Android Virtual Devices with Bumblebee?

Eknoes
- 508
- 1
- 11
- 24
3
votes
0 answers
OpenGL shadow mapping weirdness with uniform array
I was trying to run a little game/demo written by a friend of mine for mostly educational purposes, Land of Dreams. I noticed some extremely strange behaviour on my computer, even though the application was reportedly run successfully on several…

Kristóf Marussy
- 1,202
- 8
- 18
3
votes
1 answer
NVIDIA Optimus card not switching under OpenGL
When I used glGetString(GL_VERSION) and glGetString(GL_SHADING_LANGUAGE_VERSION) to check the OpenGL version on my computer, I got the following information:
3.1.0 - Build 8.15.10.2538 for GL_VERSION
1.40 - Intel Build 8.15.10.2538 for…

Amy
- 33
- 1
- 3
3
votes
1 answer
OpenGL 3.3: two different results on two GPUs (NVIDIA Optimus) with shadow mapping
So I'm working on a project (both to learn and to create a game in the future) in C++, and for rendering I've chosen OpenGL 3.3. I've been working on the Intel HD 4000 built into my processor, since new applications open on it by default, and everything went smoothly. But then…

RippeR
- 1,472
- 1
- 12
- 23
3
votes
0 answers
Weird VGL Notice - [VGL] NOTICE: Pixel format of 2D X server does not match pixel format of Pbuffer. Disabling PBO readback
I'm porting a game that I wrote from Windows to Linux. It uses GLFW and OpenGL. When I run it using optirun, to take advantage of my NVIDIA Optimus setup, it spits this out to the console:
[VGL] NOTICE: Pixel format of 2D X server does not match…

drautb
- 505
- 6
- 12