
I'm new to OpenGL and trying to run some classic sample code from Edward Angel's famous book Interactive Computer Graphics. However, I've encountered a problem which puzzles me a lot but is probably very obvious to many others. The sample code is "Gasket.c" which can be downloaded from http://www.cs.unm.edu/~angel/BOOK/INTERACTIVE_COMPUTER_GRAPHICS/FIFTH_EDITION/PROGRAMS/CHAPTER02/

I've run this code with VS2010 Express on my Lenovo X60 (Windows 7) and it successfully drew the Sierpinski gasket. However, when I ran the same code on my desktop (Intel HD Graphics) with the same software setup, it drew nothing and only showed a blank (white) window. The file compiled/built with no problems and there were no errors/warnings.

Gasket.c is as follows:

/* Two-Dimensional Sierpinski Gasket          */
/* Generated Using Randomly Selected Vertices */
/* And Bisection                              */

#include <stdlib.h>   /* for rand() */

#ifdef __APPLE__
#include <GLUT/glut.h>
#else
#include <GL/glut.h>
#endif

void myinit()
{

/* attributes */

     glClearColor(1.0, 1.0, 1.0, 1.0); /* white background */
     glColor3f(1.0, 0.0, 0.0); /* draw in red */

/* set up viewing */
/* 500 x 500 window with origin lower left */

     glMatrixMode(GL_PROJECTION);
     glLoadIdentity();
     gluOrtho2D(0.0, 50.0, 0.0, 50.0);
     glMatrixMode(GL_MODELVIEW);
}

void display( void )
{
    GLfloat vertices[3][2]={{0.0,0.0},{25.0,50.0},{50.0,0.0}}; /* A triangle */

    int j, k;
    GLfloat p[2] = {7.5, 5.0};  /* an arbitrary initial point inside the triangle */

    glClear(GL_COLOR_BUFFER_BIT);  /* clear the window */


/* compute and plot 5000 new points */

    glBegin(GL_POINTS);

    for( k=0; k<5000; k++)
    {
         j=rand()%3; /* pick a vertex at random */


     /* Compute point halfway between selected vertex and old point */

         p[0] = (p[0]+vertices[j][0])/2.0; 
         p[1] = (p[1]+vertices[j][1])/2.0;

     /* plot new point */

        glVertex2fv(p);

    }
    glEnd();
    glFlush(); /* force execution of queued GL commands */
}

int main(int argc, char** argv)
{

/* Standard GLUT initialization */

    glutInit(&argc,argv);
    glutInitDisplayMode (GLUT_SINGLE | GLUT_RGB); /* default, not needed */
    glutInitWindowSize(500,500); /* 500 x 500 pixel window */
    glutInitWindowPosition(0,0); /* place window top left on display */
    glutCreateWindow("Sierpinski Gasket"); /* window title */
    glutDisplayFunc(display); /* display callback invoked when window opened */

    myinit(); /* set attributes */

    glutMainLoop(); /* enter event loop */
}
asked by iyl
    This isn't an answer but a useful tip - you're learning highly outdated OpenGL. If you are intent on actually learning modern graphics, stop now and learn the programmable pipeline instead. It is both faster and more flexible, although it has a steeper initial learning curve - but hey, who drives an automatic Lamborghini, right? Question-relevant tip: Try drawing one dot, or one triangle, then a few, and changing the colors around. That should help isolate the problem. – GraphicsMuncher Aug 20 '13 at 13:16
  • Yes, you're right. I still want to learn the programmable pipeline. I actually started with OpenGL tutorial 3.2+ but it couldn't run on my laptop (Lenovo X60), nor did it run on my desktop with Intel HD graphics card. Then I switched to OpenGL tutorial 2.1 for my desktop and it runs fine. But I'm stuck with my laptop :( I think I will have to update my laptop eventually although I still quite like it... – iyl Aug 21 '13 at 02:23

1 Answer


Try switching from single-buffering (GLUT_SINGLE) to double-buffering (GLUT_DOUBLE):

#include <stdlib.h>   /* for rand() */

#include <GL/glut.h>

void display( void )
{
    GLfloat vertices[3][2]={{0.0,0.0},{25.0,50.0},{50.0,0.0}}; /* A triangle */

    int j, k;
    GLfloat p[2] = {7.5, 5.0};  /* an arbitrary initial point inside the triangle */

    glClearColor(1.0, 1.0, 1.0, 1.0); /* white background */
    glClear(GL_COLOR_BUFFER_BIT);  /*clear the window */

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();

    gluOrtho2D(0.0, 50.0, 0.0, 50.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* compute and plot 5000 new points */
    glColor3f(1.0, 0.0, 0.0); /* draw in red */
    glBegin(GL_POINTS);
    for( k=0; k<5000; k++)
    {
        j=rand()%3; /* pick a vertex at random */

        /* Compute point halfway between selected vertex and old point */
        p[0] = (p[0]+vertices[j][0])/2.0; 
        p[1] = (p[1]+vertices[j][1])/2.0;

        /* plot new point */
        glVertex2fv(p); 

    }
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char** argv)
{
    glutInit(&argc,argv);
    glutInitDisplayMode (GLUT_DOUBLE | GLUT_RGB); /* request a double-buffered window */
    glutInitWindowSize(500,500); /* 500 x 500 pixel window */
    glutInitWindowPosition(0,0); /* place window top left on display */
    glutCreateWindow("Sierpinski Gasket"); /* window title */
    glutDisplayFunc(display); /* display callback invoked when window opened */
    glutMainLoop(); /* enter event loop */
}
answered by genpfault
  • Yes, it works! Can you explain why it doesn't need double-buffering on my old Lenovo X60 while I need to do this with Intel HD Graphics? Is there any condition under which we should use single buffering or double buffering? – iyl Aug 21 '13 at 02:15
  • 1
    @iyl: I'd suspect your newer system runs an OS that does window composition. Compositors need the double-buffer swap as a hint to copy the application's picture to the screen. With a single-buffered window the compositor either has to poll constantly or hook some elementary graphics-context functionality (on Windows usually BeginPaint / EndPaint and the WM_PAINT message), which typical OpenGL operation bypasses entirely. – datenwolf Aug 21 '13 at 12:10
  • @iyl: You could also experiment with [`glFinish()` instead of `glFlush()`](http://stackoverflow.com/a/2143752/44729). – genpfault Aug 21 '13 at 14:33
  • I tried that and it didn't work. I think using double-buffering and glutSwapBuffers is the only way for OpenGL code to run properly on my desktop. @datenwolf: thanks for your explanation - it seems very convincing. How do I know if an OS is doing window composition? Is there any way in OpenGL to detect this automatically and set the right settings for it? – iyl Aug 23 '13 at 04:21
  • @iyl: OpenGL is OS-agnostic, and the way windows are drawn to the screen can vastly differ depending on the OS, the graphics system in use, and even the settings. But there are a few strong hints. Microsoft Windows has supported compositing since Windows Vista, via Aero; the presence of a process called `dwm.exe` indicates that compositing may be active. MacOS X has always composited since Tiger, with the introduction of Quartz Extreme. – datenwolf Aug 23 '13 at 10:29
  • @iyl: In X11, composition has been supported by the protocol itself for some time, but actual compositors emerged only later. The best known are Compiz and Beryl, but the standard window managers of Gnome-3 and KDE-4 also do compositing. There's also the XDamage extension, which allows a window to be explicitly marked as requiring redraw, though its semantics are not perfectly identical between compositors. Since double buffering has always been a bit weird in X11, most X11 compositors deal with single buffering just fine. – datenwolf Aug 23 '13 at 10:32