
I have an OpenGL program that uses shaders (OpenGL version 3.3, GLSL version 1.5). I have heard that the shaders actually run on the CPU unless you specifically tell them to run on the GPU. (Of course the whole reason I am using shaders is to speed up rendering by doing the calculations on the GPU instead of the CPU, so this is not the behavior I want.) Is this true? If so, how do you get the shader to run on the GPU?

Alex319

2 Answers


I'm not sure where you heard that, but assuming you have halfway decent drivers from the graphics vendor, your shaders will run on the GPU without your doing anything but loading and using them.
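
For context, roughly all the application has to do is compile, link, and bind a program; the driver takes care of running it on the GPU. Here is a minimal sketch, assuming a current OpenGL 3.3 context and an initialized extension loader such as GLEW (the helper names `compile` and `make_program` are just placeholders for illustration):

```c
/* Minimal sketch: compile and link a shader program.
 * Assumes a current GL 3.3 context and that glewInit() has been called. */
#include <GL/glew.h>
#include <stdio.h>

static GLuint compile(GLenum type, const char *src)
{
    GLuint sh = glCreateShader(type);
    glShaderSource(sh, 1, &src, NULL);
    glCompileShader(sh);

    GLint ok = GL_FALSE;
    glGetShaderiv(sh, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(sh, sizeof log, NULL, log);
        fprintf(stderr, "shader compile failed: %s\n", log);
    }
    return sh;
}

GLuint make_program(const char *vs_src, const char *fs_src)
{
    GLuint vs = compile(GL_VERTEX_SHADER, vs_src);
    GLuint fs = compile(GL_FRAGMENT_SHADER, fs_src);

    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);

    /* Shader objects can be deleted once the program is linked. */
    glDeleteShader(vs);
    glDeleteShader(fs);
    return prog;
}

/* Usage per frame: glUseProgram(prog); then issue your draw calls. */
```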

Jerry Coffin
  • +1 As a note, though, some features even on modern cards will cause OpenGL to evaluate the shader in software mode, resulting in quite horrible performance (`GL_LINE_SMOOTH` is a classic example). – Ron Warholic Nov 18 '10 at 20:49
  • Mac OS X's OpenGL stack can fall back to the CPU (if something goes utterly wrong). – elmattic Nov 18 '10 at 20:52
  • Does OSX [even support](http://developer.apple.com/graphicsimaging/opengl/capabilities/) 3.3? – genpfault Nov 18 '10 at 22:40
  • @genpfault see [How do I upgrade my OpenGL from 2.1 to 3.3 on Mac OSX?](https://stackoverflow.com/q/26981152/673852) – Ruslan Apr 14 '19 at 07:14

I'm pretty sure the driver always tries to run every shader on the GPU rather than the CPU. But not all modern video cards support the full feature set of shader operations. For example, as far as I know, the OpenGL 4.0 extension GL_ARB_gpu_shader_fp64 may only be partially supported, in which case the driver has to emulate some operations in software on the CPU. As for extensions, you can check what your card supports by calling glGetString with the GL_EXTENSIONS parameter.
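
A minimal sketch of such a check; note that in a 3.3 core profile `glGetString(GL_EXTENSIONS)` is no longer available, so this uses `glGetStringi` instead (`has_extension` is just an illustrative helper name):

```c
/* Sketch: check whether a named extension is present in a core profile. */
#include <GL/glew.h>
#include <string.h>

int has_extension(const char *name)
{
    GLint n = 0, i;
    glGetIntegerv(GL_NUM_EXTENSIONS, &n);
    for (i = 0; i < n; ++i) {
        const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

/* Usage: if (has_extension("GL_ARB_gpu_shader_fp64")) { ... } */
```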

Edward83