
I have a modern CPU (AMD FX 4170) and a modern GPU (NVidia GTX 660). Yet this simple program manages to fully use one of my CPU's cores. This means it uses one 4.2 GHz core to draw nothing at 60 FPS. What is wrong with this program?

#include <SDL/SDL.h>

int main(int argc, char** argv)
{
    SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO);
    SDL_SetVideoMode(800, 600, 0, SDL_OPENGL | SDL_RESIZABLE);

    while(true)
    {
        Uint32 now = SDL_GetTicks();
        SDL_GL_SwapBuffers();

        // Cap the frame rate at 60 FPS by sleeping away whatever
        // remains of the ~16 ms frame budget.
        int delay = 1000 / 60 - (SDL_GetTicks() - now);
        if(delay > 0) SDL_Delay(delay);
    }

    return 0;
}
Scintillo

2 Answers


It turns out that NVidia's drivers implement waiting for vsync with a busy loop, which causes SDL_GL_SwapBuffers() to use 100% CPU. Turning off vsync in the NVidia Control Panel removes the problem.
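If changing the setting globally in the control panel is undesirable, the application can also request vsync off itself. A minimal sketch, assuming SDL 1.2, where the SDL_GL_SWAP_CONTROL attribute controls whether swaps wait for vertical retrace (the driver's control panel setting may still override it):

SDL_Init(SDL_INIT_VIDEO);

// 0 = do not wait for vertical retrace in SDL_GL_SwapBuffers().
// GL attributes must be set before SDL_SetVideoMode() is called.
SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 0);
SDL_SetVideoMode(800, 600, 0, SDL_OPENGL | SDL_RESIZABLE);

With vsync off, the SDL_Delay() in the question's loop is then the only thing pacing the frames.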

Scintillo

A loop that never blocks uses as much computing power as it can get. The main problem may be in this line:

int delay = 1000 / 60 - (SDL_GetTicks() - now);

Your delay duration may be zero or negative, in which case SDL_Delay() is never called and the loop spins flat out. You need to check the value of the delay variable before sleeping.
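For illustration, a sketch of the same frame cap with the arithmetic done explicitly in SDL's signed Sint32, so that a slow frame produces a plainly negative delay instead of relying on unsigned wrap-around converting back to a negative int:

Uint32 frameStart = SDL_GetTicks();
SDL_GL_SwapBuffers();

// Whatever is left of the ~16 ms frame budget; the cast keeps the
// subtraction signed, so an overlong frame yields delay <= 0.
Sint32 elapsed = (Sint32)(SDL_GetTicks() - frameStart);
Sint32 delay = 1000 / 60 - elapsed;
if(delay > 0) SDL_Delay((Uint32)delay);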

Moreover, in this link it is proposed that

SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1);

can be used to enable vsync so that the loop will not use all the CPU.
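With vsync enabled this way, the buffer swap itself paces the loop, so the manual SDL_Delay() cap can be dropped. A minimal sketch, assuming SDL 1.2 (the attribute must be set before SDL_SetVideoMode()):

SDL_Init(SDL_INIT_VIDEO);
SDL_GL_SetAttribute(SDL_GL_SWAP_CONTROL, 1); // 1 = wait for vsync on swap
SDL_SetVideoMode(800, 600, 0, SDL_OPENGL | SDL_RESIZABLE);

while(true)
{
    SDL_GL_SwapBuffers(); // with vsync honored, returns once per refresh
}

As the first answer found, though, this only lowers CPU usage if the driver sleeps while waiting; a driver that busy-waits on vsync will still saturate one core.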

fatihk
  • But that is true only if the rest of the loop takes at least 1/60 of a second? It shouldn't take that long on a modern system. – Scintillo Jun 15 '13 at 06:11
  • @Scintillo, if that method takes more than 15 milliseconds, it causes this situation; could you check it with std::cout? – fatihk Jun 15 '13 at 06:17
  • Indeed, SDL_GL_SwapBuffers() uses that time. The question is why it does that. It could be waiting for vsync, but why would it use the full CPU while doing so? – Scintillo Jun 15 '13 at 06:49