3

I'm playing around with C++ in Visual Studio, using OpenGL for rendering/moving shapes and Win32 for window display etc. (moving over from a GLUT display).

How can I control the frame rate?

I see a lot of examples using dt and some form of frame refresh, but I'm unsure how to implement this... Is there something I can use as part of Win32, or can it be done in an easier way?

Also, probably a silly question, but if nothing is implemented, what is the default frame rate? Or is there none?

Reanimation
  • I wrote a short answer about limiting the framerate [here](http://stackoverflow.com/a/19677856/1888983). – jozxyqk Aug 08 '15 at 07:04

4 Answers

3

I have some very old code lying about on my USB drive which used old OpenGL and GLUT. The same timing principles apply even in more modern versions, though the drawing code would be different. The timing is imprecise, but it is sufficient to illustrate how to roughly achieve a set FPS:

// throttle the drawing rate to a fixed FPS
//compile with: g++ yourfilenamehere.cpp -lGL -lglut
#include <cstdio>
#include <cstdlib>
#include <iostream>

#include <GL/gl.h>
#include <GL/glut.h>

GLint FPS = 0;   // most recently measured frames per second

void updateFPS(void) {
  static GLint frameCounter = 0;   // frames counted over the last 1000 ms
  static GLuint currentClock;      // [milliseconds]
  static GLuint nextClock = 0;     // [milliseconds]

  ++frameCounter;
  currentClock = glutGet(GLUT_ELAPSED_TIME); //has limited resolution, so average over 1000mS
  if ( currentClock < nextClock ) return;

  FPS = frameCounter;              // frames counted over the last second = frames per second

  nextClock = currentClock + 1000; // aim the next sample at 1 second (1000 ms) in the future
  frameCounter = 0;
}

void idle() {
  static GLuint previousClock=glutGet(GLUT_ELAPSED_TIME);
  static GLuint currentClock=glutGet(GLUT_ELAPSED_TIME);
  static GLfloat deltaT;

  currentClock = glutGet(GLUT_ELAPSED_TIME);
  deltaT=currentClock-previousClock;
  if (deltaT < 35) { return; }     // 35 ms per frame throttles the loop to roughly 28 FPS
  previousClock = currentClock;

  // put your idle code here, and it will run at the designated FPS (or as close as the machine can get)

  printf(".");
  //end your idle code here

  updateFPS(); // only call this once per frame loop
  glutPostRedisplay();
}

void display() {
  glClearColor(0.0, 0.0, 0.0, 0.0);
  glClear(GL_COLOR_BUFFER_BIT);

  printf("FPS %d\n", FPS);   // report the most recent FPS measurement

  // Set the drawing color (RGB: white)
  glColor3f(1.0, 1.0, 1.0);

  glBegin(GL_LINE_STRIP); {
     glVertex3f(0.25,0.25,0.0);
     glVertex3f(0.75,0.25,0.0);
     glVertex3f(0.75,0.75,0.0);
     glVertex3f(0.25,0.75,0.0);
     glVertex3f(0.25,0.25,0.0);
  }
  glEnd(); 

  glutSwapBuffers();
}

void init() {
   glMatrixMode(GL_PROJECTION);
   glLoadIdentity();
   glOrtho(0.0,1.0,0.0,1.0,-1.0,1.0); 
}

void keyboard(unsigned char key, int x, int y)
{
   switch (key) {
      case 27:  // escape key
         exit(0);
         break;
      default:
         break;
   }
}

int main(int argc, char** argv) {
   glutInit(&argc, argv);
   glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGB);
   glutCreateWindow("FPS test");

   glutIdleFunc(idle);
   glutDisplayFunc(display);
   glutKeyboardFunc(keyboard);

   init();

   glutMainLoop();
   return 0;
}

Hope this helps a bit :) let me know if you need more information.

GMasucci
2

If you don't plan to build a multi-user system, you don't need to fix the frame rate (you can measure the time spent on the last frame using QueryPerformanceCounter and assume the next one will take approximately as long). Then you simply move objects according to the frame time.

If you apply force/acceleration in this model you may need to compensate, for instance using velocity Verlet integration.
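
For reference, here is a minimal sketch of one velocity Verlet step for a single 1-D particle; acceleration() and verletStep() are made-up names standing in for whatever force model and integrator you actually use:

#include <cstdio>

// made-up force model: constant downward gravity
double acceleration(double /*x*/) { return -9.81; }

// one velocity Verlet step for a single 1-D particle
void verletStep(double& x, double& v, double dt) {
    double a0 = acceleration(x);
    x += v * dt + 0.5 * a0 * dt * dt;   // position update uses the current acceleration
    double a1 = acceleration(x);        // acceleration at the new position
    v += 0.5 * (a0 + a1) * dt;          // velocity update averages old and new acceleration
}

int main() {
    double x = 0.0, v = 0.0;
    for (int i = 0; i < 10; ++i)
        verletStep(x, v, 1.0 / 60.0);   // ten steps at 60 Hz
    std::printf("x = %f  v = %f\n", x, v);
}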

Fixing the frame rate can, however, get a bit messy, especially if your CPU/GPU load per frame varies a lot.

Simple version for fixed frame-rate:

If you're just after smooth movement on your machine and want a fixed frame rate, implement this pseudocode:

fps = 30    # Pick something good for you, 30 and 60 are common values.
main_loop:
    t0 = time()
    update_and_render(1/fps)
    t1 = time()
    frame_time = t1-t0
    sleep(1/fps - frame_time)
    goto main_loop

frame_time in this example is what your examples are referring to as dt ("delta time"), but most examples won't have a fixed frame rate. Instead they move your sprites according to how long the last frame took.
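
For reference, a minimal C++ sketch of the same fixed-rate loop using std::chrono and std::this_thread::sleep_for; update_and_render is just a placeholder for your own update and draw code:

#include <chrono>
#include <thread>

// stand-in for your own update + draw call: move objects by velocity * dt, then draw
void update_and_render(double dt) { (void)dt; }

int main() {
    using clock = std::chrono::steady_clock;
    const double fps = 30.0;                                  // pick something good for you
    const auto frame_period = std::chrono::duration<double>(1.0 / fps);

    for (;;) {
        auto t0 = clock::now();
        update_and_render(1.0 / fps);                         // fixed dt, as in the pseudocode
        auto frame_time = clock::now() - t0;
        if (frame_time < frame_period)                        // sleep away the rest of the frame
            std::this_thread::sleep_for(frame_period - frame_time);
    }
}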

The same for varying frame-rate:

last_frame_rate = 1/100
main_loop:
    t0 = time()
    update_and_render(last_frame_rate)
    t1 = time()
    last_frame_rate = t1-t0
    goto main_loop
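
And a matching C++ sketch of the varying frame-rate version, again with update_and_render as a placeholder for your own code; each update is advanced by however long the previous frame took:

#include <chrono>

// stand-in for your own update + draw call: e.g. position += velocity * dt, then draw
void update_and_render(double dt) { (void)dt; }

int main() {
    using clock = std::chrono::steady_clock;
    double last_frame_time = 1.0 / 100.0;    // initial guess, as in the pseudocode

    for (;;) {
        auto t0 = clock::now();
        update_and_render(last_frame_time);  // move objects by how long the previous frame took
        last_frame_time = std::chrono::duration<double>(clock::now() - t0).count();
    }
}
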
Jonas Byström
  • Interesting... I shall give those links a look/read. The only reason I'm looking into it is because even though I move my shapes by `0.01f` each time, sometimes it moves faster than others, and I put this down to a fluctuating frame rate?? Hence wondering if I should fix it. Thanks for posting. – Reanimation Nov 26 '13 at 14:25
  • Yes, your frame rate will differ between machines, and rightly so (on a fast computer, the animation should be smoother). If you feel like it, you can add some tiny sleep in your main loop to not consume all CPU you can, this makes the rest of the system more responsive. – Jonas Byström Nov 26 '13 at 14:51
  • The fluctuation happens with the same program on the same system (changing nothing). For example, if I run the program now, it might be smooth, but if I was to run it in an hour, it might respond slower or faster... I don't know if it's because I'm executing it in VS on Windows 7 on a MacBook Pro, dual booted... Maybe the configs are different or something... I was always under the impression it would vary from system to system, but not differ between executions... – Reanimation Nov 26 '13 at 15:09
  • It all depends on what else you're running as well. If you're hosting a web server on the same machine, or some antivirus is checking your drive, or the operating system has scheduled some task, or a browser is downloading something in the background, or... you get the picture. – Jonas Byström Nov 26 '13 at 15:14
  • Ah ok. I see. I'll see if I can implement a small framerate checker or something. Thanks. – Reanimation Nov 26 '13 at 15:32
  • You can't measure rendering time with before and after draw call measurements like that because the GPU is asynchronous. See [this](http://stackoverflow.com/a/19677856/1888983). – jozxyqk Aug 08 '15 at 07:01
1

To answer your question: there is no fixed default frame rate; it depends on the performance of your graphics card.

dt means delta time, that is, the time since the last frame or update. Depending on what you want to achieve, you can measure dt between two frames and use it to update the position of your objects.

You can control the frame rate with time measurements: if you want to force 30 FPS and you know that rt is the time needed to render a frame, then you can render a frame at time zero, wait 1/30 - rt, render the next frame, and so on.

In a conceptually cleaner model you would separate the rendering from the update of the data model, so that you use a fixed dt like 1/30 seconds to update your positions etc. In this model the rendering runs as often as possible, and therefore you store the last two positions. An interpolation parameter between 0 and 1 can then be used by the rendering function to interpolate between those positions (a rough sketch of this loop follows the list below). Using this model you get several advantages:

  • You can simulate without rendering
  • You get a deterministic simulation of your data because your dt is fixed
  • dt can be used to implement slow-motion or fast-forward playback of the rendering easily.
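
Here is a rough C++ sketch of that separation, with made-up update()/render() functions and a single 1-D position: the simulation always steps by the fixed dt, and the renderer interpolates between the last two positions using a 0-1 parameter:

#include <chrono>

const double dt = 1.0 / 30.0;            // fixed simulation step
double prevPos = 0.0, currPos = 0.0;     // the last two simulated positions
double velocity = 1.0;

void update() {                          // advances the simulation by exactly dt
    prevPos = currPos;
    currPos += velocity * dt;
}

void render(double alpha) {              // alpha in [0,1]: progress into the current step
    double drawPos = prevPos + (currPos - prevPos) * alpha;
    (void)drawPos;                       // draw the object at drawPos here
}

int main() {
    using clock = std::chrono::steady_clock;
    auto previous = clock::now();
    double accumulator = 0.0;

    for (;;) {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= dt) {      // run as many fixed steps as real time demands
            update();
            accumulator -= dt;
        }
        render(accumulator / dt);        // interpolate between the last two states
    }
}

The accumulator collects real elapsed time and spends it in whole dt-sized steps, which is what keeps the simulation deterministic regardless of how fast the rendering loop runs.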
DarthB
  • When I run my program, I am able to move the cube by `0.01f`, and sometimes it moves faster than at other times even though nothing in the program has changed. I can only assume this is due to the frame rate fluctuating... That's why I was thinking to set the frame rate... Would you agree? Or am I confusing this strange behaviour with something else? – Reanimation Nov 26 '13 at 14:22
  • Yes, I think you are right, but instead of trying to force a frame rate, just get the time between the frames, *dt*, and move the cube by 0.1f*dt. That way the movement depends on the time the program needed to render the frame. – DarthB Nov 27 '13 at 12:41
1

Using time_t or clock_t will usually give a delta of 0 because their resolution is too coarse; instead, use the high-resolution QueryPerformanceCounter API from windows.h:

#include <windows.h>

#define FRAMES_PER_SEC 60.0

void tickFPS();

LARGE_INTEGER frequency;        // counter ticks per second
LARGE_INTEGER t1, t2;           // counter values at the previous and current frame
double elapsedTime;             // [milliseconds]

int main(void){
    QueryPerformanceFrequency(&frequency);
    QueryPerformanceCounter(&t1);   // initialise the "previous frame" timestamp
    for(;;){
        // do something (drawing)
        tickFPS();  // sleep away the rest of the frame so the loop runs at FRAMES_PER_SEC
    }
    return 0;
}

// waits out whatever is left of the frame period since the last call
void tickFPS(){
    QueryPerformanceCounter(&t2);
    elapsedTime = (t2.QuadPart - t1.QuadPart) * 1000.0 / frequency.QuadPart; // ms since last call
    double framePeriod = 1000.0 / FRAMES_PER_SEC;                            // target ms per frame
    if(elapsedTime < framePeriod){
        Sleep((DWORD)(framePeriod - elapsedTime));
    }
    t1 = t2;
}
Eboubaker