I am trying to limit an SDL program to 60 FPS and calculate the FPS using this code:
static const Uint32 min_frame_time = 16; // roughly 1000 ms / 60 frames
Uint32 start_time = SDL_GetTicks();
// Rendering stuff...
time_delta = SDL_GetTicks() - start_time;
fps_sum += 1000.0 / (float)time_delta;
fps_count++;
if(fps_count >= fps_max_count)
{
printf("FPS: %f\n", fps_sum / (float)fps_count);
fps_count = 0;
fps_sum = 0.0;
}
if(time_delta < min_frame_time)
SDL_Delay(min_frame_time - time_delta);
But it seems like SDL_Delay somehow affects the return values of SDL_GetTicks: time_delta now gets values between 0 and 3, whereas it is normally about 15 if I remove just the last two lines.
This makes no sense to me. Does anyone know what is wrong?
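For reference, a standalone check of SDL_Delay against SDL_GetTicks would look roughly like this (hypothetical test program, assuming SDL2; this is not code from my project):

#include <SDL2/SDL.h>
#include <stdio.h>

int main(void)
{
    SDL_Init(SDL_INIT_TIMER);
    Uint32 before = SDL_GetTicks();
    SDL_Delay(16); // sleep for one target frame
    Uint32 after = SDL_GetTicks();
    printf("before: %u, after: %u, slept: %u ms\n",
           (unsigned)before, (unsigned)after, (unsigned)(after - before));
    SDL_Quit();
    return 0;
}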
EDIT:
The code above is basically the main loop of my program. I first implemented an FPS counter by saving the time before rendering the scene in start_time and afterwards computing the average FPS over multiple loop iterations, which worked fine.
Then I added the last two lines to limit the FPS to 60: if the rendering finished faster than min_frame_time, the program should wait for the rest of the frame. But after adding this, the results of SDL_GetTicks() in every loop iteration except the first became strange, and time_delta dropped to the small values I mentioned above.
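In case it helps, here is a minimal, self-contained version of the loop. Only the timing code is taken verbatim from the snippet above; the SDL2 setup, the placeholder rendering (clear/present) and the declarations such as fps_max_count = 100 are reconstructions of my setup, not exact copies:

#include <SDL2/SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    static const Uint32 min_frame_time = 16; // roughly 1000 ms / 60 frames
    const int fps_max_count = 100;           // averaging window; the real value in my code may differ

    float fps_sum = 0.0f;
    int fps_count = 0;

    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_Window *window = SDL_CreateWindow("fps test",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED, 640, 480, 0);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);
    if (renderer == NULL)
        return 1;

    int running = 1;
    while (running)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))
            if (event.type == SDL_QUIT)
                running = 0;

        Uint32 start_time = SDL_GetTicks();

        // Rendering stuff... (placeholder: clear the screen and present it)
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        SDL_RenderPresent(renderer);

        Uint32 time_delta = SDL_GetTicks() - start_time;

        fps_sum += 1000.0f / (float)time_delta; // becomes inf when time_delta is 0
        fps_count++;
        if (fps_count >= fps_max_count)
        {
            printf("FPS: %f\n", fps_sum / (float)fps_count);
            fps_count = 0;
            fps_sum = 0.0f;
        }

        if (time_delta < min_frame_time)
            SDL_Delay(min_frame_time - time_delta);
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}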