
I have an SDL/OpenGL game I am working on for fun. I get a decent FPS on average, but movement is really choppy because SDL_GL_SwapBuffers() will randomly take a very long time to process. With textures loaded and written to the buffer it will sometimes take over 100 ms! I cut out a lot of my code to try to figure out whether it was something I did wrong, but I haven't had much luck. Even when I run this bare-bones program it will still block for up to 70 ms at times.

Main:

// Don't forget to link to opengl32, glu32, SDL_image.lib

// includes
#include <stdio.h>

// SDL
#include <cstdlib>
#include <SDL/SDL.h>

// Video
#include "videoengine.h"

int main(int argc, char *argv[])
{
    // begin SDL
    if ( SDL_Init(SDL_INIT_VIDEO) != 0 )
    {
        printf("Unable to initialize SDL: %s\n", SDL_GetError());
    }

    // begin video class
    VideoEngine videoEngine;

    // BEGIN MAIN LOOP
    bool done = false;
    while (!done)
    {
        int loopStart = SDL_GetTicks();

        printf("STARTING SWAP BUFFER : %d\n", SDL_GetTicks() - loopStart);
        SDL_GL_SwapBuffers();


        int total = SDL_GetTicks() - loopStart;
        if (total > 6)
            printf("END LOOP  : %d ------------------------------------------------------------>\n", total);
        else
             printf("END LOOP  : %d\n", total);

    }
    // END MAIN LOOP

    return 0;
}

My "VideoEngine" constructor:

VideoEngine::VideoEngine()
{
    UNIT = 16;
    SCREEN_X = 320;
    SCREEN_Y = 240;
    SCALE = 1;


    // Begin Initialization

        SDL_Surface *screen;

        SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );  // [!] SDL_GL_SetAttribute must be called BEFORE SDL_SetVideoMode

        screen = SDL_SetVideoMode( SCALE*SCREEN_X, SCALE*SCREEN_Y, 16, SDL_OPENGL );  // Set screen to the window with opengl
        if ( !screen )  // make sure the window was created
        {
            printf("Unable to set video mode: %s\n", SDL_GetError());
        }

        // set opengl state
        opengl_init();

    // End Initialization

}

void VideoEngine::opengl_init()
{
    // Set the OpenGL state after creating the context with SDL_SetVideoMode

        //glClearColor( 0, 0, 0, 0 );                             // sets screen buffer to black
        //glClearDepth(1.0f);                                     // Tells OpenGL what value to reset the depth buffer when it is cleared
        glViewport( 0, 0, SCALE*SCREEN_X, SCALE*SCREEN_Y );     // sets the viewport to the default resolution (SCREEN_X x SCREEN_Y) multiplied by SCALE. (x,y,w,h)
        glMatrixMode( GL_PROJECTION );                          // Applies subsequent matrix operations to the projection matrix stack.
        glLoadIdentity();                                       // Replaces the current matrix with the identity matrix
        glOrtho( 0, SCALE*SCREEN_X, SCALE*SCREEN_Y, 0, -1, 1 ); //describes a transformation that produces a parallel projection
        glMatrixMode( GL_MODELVIEW );                           // Applies subsequent matrix operations to the modelview matrix stack.
        glEnable(GL_TEXTURE_2D);                                // Need this to display a texture
        glLoadIdentity();                                       // Replaces the current matrix with the identity matrix
        glEnable(GL_BLEND);                                     // Enable blending for transparency
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);      // Specifies pixel arithmetic
        //glDisable( GL_LIGHTING );                               // Disable lighting
        //glDisable( GL_DITHER );                                 // Disable dithering
        //glDisable( GL_DEPTH_TEST );                             // Disable depth testing

        //Check for error
        GLenum error = glGetError();
        if( error != GL_NO_ERROR )
        {
         printf( "Error initializing OpenGL! %s\n", gluErrorString( error ) );
        }

    return;
}

I'm starting to think I might have a hardware issue? I have never had this problem with a game before, though.

Alden
  • Do you have any vsync enabled? – Tim Oct 17 '12 at 18:04
  • No, I don't think so. After reading http://stackoverflow.com/questions/589064/how-to-enable-vertical-sync-in-opengl I think what may be going on is that calling SwapBuffers() without giving it time to process may cause it to hang or loop. I will do some more reading and see if I can't find something to support that. – Alden Oct 17 '12 at 20:17
  • [Toss](http://www.msarnoff.org/sdb/) an `SDL_Delay(1)` after the buffer swap and see what it does to your frame times (a minimal sketch follows these comments). – genpfault Oct 17 '12 at 20:38
  • Not perfect, but significantly improved. I also noticed that part of the jitter is because, when the screen moves, one frame gets duplicated. Thanks! – Alden Oct 17 '12 at 20:48
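
For reference, here is a minimal sketch (not the original code) combining the two suggestions from the comments above: requesting vsync through SDL 1.2's `SDL_GL_SWAP_CONTROL` attribute (available since SDL 1.2.10) and yielding with `SDL_Delay(1)` after the swap. The event loop and sizes are placeholders, not taken from the question.

// Minimal sketch, assuming SDL 1.2.10+ (where SDL_GL_SWAP_CONTROL exists).
// Not the original code; it just combines the vsync and SDL_Delay suggestions.
#include <cstdio>
#include <SDL/SDL.h>

int main(int argc, char *argv[])
{
    if ( SDL_Init(SDL_INIT_VIDEO) != 0 )
        return 1;

    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    SDL_GL_SetAttribute( SDL_GL_SWAP_CONTROL, 1 );      // 1 = vsync on, 0 = off

    SDL_Surface *screen = SDL_SetVideoMode( 320, 240, 16, SDL_OPENGL );
    if ( !screen )
        return 1;

    int swapControl = 0;
    SDL_GL_GetAttribute( SDL_GL_SWAP_CONTROL, &swapControl );   // what the driver actually gave us
    printf( "swap control: %d\n", swapControl );

    bool done = false;
    while ( !done )
    {
        SDL_Event event;
        while ( SDL_PollEvent( &event ) )
            if ( event.type == SDL_QUIT )
                done = true;

        // ... render ...

        SDL_GL_SwapBuffers();
        SDL_Delay( 1 );   // yield to the OS after the swap, per the comment above
    }

    SDL_Quit();
    return 0;
}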

1 Answer


SDL does use the SwapIntervalEXT extension, so you can make sure that the buffer swaps are as fast as possible (VSYNC disabled). Also, a buffer swap is not a simple operation: OpenGL may need to copy the contents of the back buffer to the front buffer in case you want to call glReadPixels() afterwards. This behavior can be controlled through WGL_ARB_pixel_format, using WGL_SWAP_EXCHANGE_ARB (you can read about all this in the spec; I'm not sure whether there is an equivalent for Linux).
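
As an illustration only (this is not part of the original answer), on Windows the swap interval can be forced through the WGL_EXT_swap_control extension once a GL context exists; the helper name below is made up for the example.

// Illustrative sketch, assuming Windows and an already-created GL context
// (e.g. after SDL_SetVideoMode). The helper name is made up for the example.
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (APIENTRY *PFNWGLSWAPINTERVALEXTPROC)(int interval);

void set_swap_interval(int interval)   // 0 = swap as fast as possible (no vsync), 1 = sync to refresh
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress( "wglSwapIntervalEXT" );
    if ( wglSwapIntervalEXT )
        wglSwapIntervalEXT( interval );
}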

And then, on top of all that, there is the windowing system, which can actually cause a lot of trouble. Also, if some errors are generated ...

This behavior is probably ok if you're running on a small mobile GPU.

SDL_GL_SwapBuffers() only contains a call to glXSwapBuffers() / wglSwapBuffers(), so there is no extra time spent in there.

the swine
  • It's named "swap" buffers precisely because no sane implementation actually performs a copy. – Ben Voigt Oct 18 '14 at 16:23
  • @BenVoigt Well, apparently WGL does - read the [`WGL_SWAP_EXCHANGE_ARB`-related docs](http://oss.sgi.com/projects/ogl-sample/registry/ARB/wgl_pixel_format.txt) (I'm not contesting your comment about sanity, though). The reason behind this is to be able to read the contents of the buffer even after the swap (not sure why someone would want to do that), and that is the default behavior. By enabling `WGL_SWAP_EXCHANGE_ARB` you acknowledge that after the swap the buffers are exchanged and you can't get to your data anymore. – the swine Oct 19 '14 at 08:10
  • Are you sure you don't have to explicitly request having valid data in the buffer? That `WGL_SWAP_METHOD_ARB` parameter has three options, and it isn't clear which one is the default. – Ben Voigt Oct 19 '14 at 16:50
  • @BenVoigt You're right, looking at it again, it does not. However, it was in some optimization guide from NVIDIA; they stated pretty clearly that it needs to be set explicitly (a rough sketch of requesting it follows these comments). – the swine Oct 20 '14 at 07:49
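
For completeness, here is a rough sketch of what requesting `WGL_SWAP_EXCHANGE_ARB` through `wglChoosePixelFormatARB` looks like. This is an illustration based on the WGL_ARB_pixel_format spec, not code from this thread; it assumes `<GL/wglext.h>` for the constants and function-pointer typedefs, plus a temporary context so that `wglGetProcAddress()` resolves.

// Rough sketch only: ask for an "exchange" swap method via WGL_ARB_pixel_format.
// Assumes <GL/wglext.h> and a temporary GL context so wglGetProcAddress() works.
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

int choose_exchange_pixel_format(HDC hdc)   // helper name made up for the example
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress( "wglChoosePixelFormatARB" );
    if ( !wglChoosePixelFormatARB )
        return 0;   // extension not available

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_SWAP_METHOD_ARB,    WGL_SWAP_EXCHANGE_ARB,   // swap the buffers, don't copy
        0
    };

    int format = 0;
    UINT count = 0;
    if ( !wglChoosePixelFormatARB( hdc, attribs, NULL, 1, &format, &count ) || count == 0 )
        return 0;
    return format;   // pass to SetPixelFormat() before creating the real context
}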