
I'm attempting to incorporate a pathfinding algorithm I made into my code, but I'm running into a problem. I want the code to be flexible and accept data sets of different lengths, and then draw the points using OpenGL. My problem is that I'm using a dynamically allocated array for the points to get the variable length, and OpenGL doesn't seem to accept it when converting data types. glVertex2i() wants GLint for its two parameters, but when I try to convert my array values to GLint I get a blank window. I understand GLint is a typedef, but it won't take the regular int from the array. Please help!

struct Points { int x, y; };     // my struct to hold the x,y coords
int size;                        // the size of the array (set in another function)
Points *crds = new Points[size]; // the data for this array was input in another function

for (int i = 0; i < size; i++)
{
    // these calls configure the look of the points
    glEnable(GL_POINT_SMOOTH);
    glPointSize(100);
    glColor3f(250, 250, 250);
    glBegin(GL_POINTS);
    for (int i = 0; i < size; i++)
    {
        glVertex2i((GLint)crds[i].x, (GLint)crds[i].y);
    }
    glEnd();
}
  • Why do you have two nested `for` loops, both with a loop counter `i`? What is the outer loop meant to do? – alter_igel Jul 12 '18 at 05:10
  • I don't think the problem is in the conversion between int and GLint. Are you sure the data is valid (size isn't zero, or the vertices aren't all the same)? – Taylor Nichols Jul 12 '18 at 05:21
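To rule out the concern in the second comment, it helps to dump the data right before drawing. A minimal sanity-check sketch using the question's own crds and size:

    #include <cstdio>

    printf("size = %d\n", size); // should be > 0
    for (int i = 0; i < size; i++)
        printf("point %d: (%d, %d)\n", i, crds[i].x, crds[i].y);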

1 Answer

  1. use GLint for the coords

    int is not guaranteed to be 32 bit; it differs from compiler to compiler and from platform to platform, and can be 16/32/64 bit these days. Your cast should work too, but if you use GLint throughout, you do not need the (GLint) cast in glVertex2i and you can also use the vector version, like this:

    GLint pnt[100][2];   // pnt[i][0] = x, pnt[i][1] = y
    glVertex2iv(pnt[i]); // pnt[i] decays to a GLint* pointing at the i-th pair
    

    or like this:

    GLint pnt[100<<1];         // flat array: x at index 2*i, y at index 2*i+1
    glVertex2iv(pnt+(i<<1));   // pass the address of the i-th pair, not its value
    

    But the real problem most likely lies in the following bullets (a sketch combining all the fixes follows after the list)...

  2. matrices

    We do not see any matrices, nor the range of your points. OpenGL uses unit (identity) matrices by default, which means your points must lie inside <-1,+1> to be visible, and that is not practical with integer coordinates. So if your points are in pixels and your screen resolution is xs,ys, you should add this before your render:

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(-1.0,-1.0,0.0);               // shift so pixel (0,0) maps to the lower-left corner
    glScalef(2.0/float(xs),2.0/float(ys),0.0); // map xs,ys pixels onto the 2.0-wide clip range
    
  3. glColor3f(250,250,250)

    The floating-point color range in OpenGL is <0.0,1.0>, so you are setting out-of-range colors (values outside that range are clamped). Try this instead:

    glColor3f(1.0,1.0,1.0);
    
  4. glPointSize(100)

    100 is most likely too big, as the size is in pixels; try:

    glPointSize(8);
    

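Putting the four bullets together, here is a minimal sketch of the whole render pass. The renderPoints name and the xs,ys window-size variables are just placeholders for illustration; crds and size are the question's own data, with Points changed to hold GLint per bullet #1:

    struct Points { GLint x, y; }; // GLint instead of int (bullet #1)

    void renderPoints(Points *crds, int size, int xs, int ys)
    {
        // bullet #2: map pixel coordinates onto the default <-1,+1> clip range
        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(-1.0, -1.0, 0.0);
        glScalef(2.0 / float(xs), 2.0 / float(ys), 0.0);

        glEnable(GL_POINT_SMOOTH);
        glPointSize(8);           // bullet #4: point size in pixels
        glColor3f(1.0, 1.0, 1.0); // bullet #3: colors live in <0.0,1.0>

        glBegin(GL_POINTS);
        for (int i = 0; i < size; i++)
            glVertex2i(crds[i].x, crds[i].y); // no cast needed any more
        glEnd();
    }
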
That is all I can think of that could be wrong... Look here:

the related QA contains a working example for both the old and the new API.
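For completeness, the same points can also be drawn without glBegin/glEnd by using client-side vertex arrays, which is still legacy API but closer in spirit to the buffer-based new one. A sketch, assuming Points holds two tightly packed GLints as above:

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_INT, 0, crds); // 2 GLints per vertex, tightly packed
    glDrawArrays(GL_POINTS, 0, size);    // draw all the points in one call
    glDisableClientState(GL_VERTEX_ARRAY);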

Spektre