
I'm not very experienced with the OpenGL library, so I'm having trouble understanding why, when I move some initialization code into a class or a function, GL stops drawing to the screen. Some research suggests that the library is "global" or state-based rather than object-based?

Anyway, here is some code that works:

GLuint vertexArrayBuffer;
glGenVertexArrays(1, &vertexArrayBuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexArrayBuffer);
// VBO is ready to accept vertex data
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
glBindVertexArray(0);    
while(!screen.isClosed()) {
    // Give the screen a background color
    screen.paint(0.0f, 0.0f, 0.5f, 1.0f);

    glBindBuffer(GL_ARRAY_BUFFER, vertexArrayBuffer);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
    glDrawArrays(GL_TRIANGLES, 0, 3); 
    glDisableVertexAttribArray(0);


    // Switch to display buffer after drawing all of the above   
    screen.swapBuffers();
}

This is all enclosed in the main function, with not much programming structure. The output is a nice white triangle on a blueish background.

Here is the issue: taking the exact same code that appears before the event loop and wrapping it in a function:

GLuint initVertexArray(vertex vertices[]) {
    // Create a handle for the VBO for use as a reference in gl vertex functions
    GLuint vertexArrayBuffer;
    glGenVertexArrays(1, &vertexArrayBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexArrayBuffer);
    // VBO is ready to accept vertex data
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    glBindVertexArray(0);

    return vertexArrayBuffer;
}

and calling it with GLuint vertexArrayBuffer = initVertexArray(vertices); in the main function produces no output of any kind, and no errors either, just the same blueish background.

  • Does `glGetError` have anything to say? – Jongware Oct 01 '15 at 21:58
  • @Jongware glGetError() returns 0 at each gl call. – Scholar Oct 01 '15 at 22:09
  • Unrelated to your problem and just a piece of advice: OpenGL is nigh impossible to wrap into OOP if you want to cover each and every feature. Many people tried and failed. Over the past 15 years or so I tried to pull it off a few times, but there's always something that prevents you from covering everything. – datenwolf Oct 01 '15 at 23:49
  • @datenwolf Appreciate the solid advice, I certainly won't be trying to, just the basic setup would be nice to avoid the complexity. Any idea as to why it's so impossible from your journey? Fairly curious about this. – Scholar Oct 01 '15 at 23:52
  • The main problem is that object ownership in OpenGL is a complex beast. Take textures for example. The naive approach would be a `class GLTexture`. Problem with that: texture objects belong to contexts. So using a texture is only possible with the very context it was created with. How do you translate that? The naive approach would be for the context to hand out a smart_ptr to the texture objects and somehow enforce the texture-context relationship. This however raises the problem that a texture may actually be owned by several contexts, which breaks the previously introduced constraint. – datenwolf Oct 02 '15 at 01:03
  • I know it can be difficult to identify duplicates until you know what the problem is, but this is almost exactly the same as: http://stackoverflow.com/questions/26793266/weird-opengl-issue-when-factoring-out-code. – Reto Koradi Oct 02 '15 at 02:37

1 Answer


Have you checked what sizeof(vertices) is returning? In this case vertices[] will decay into a pointer, so I would imagine that sizeof(vertices) is sizeof(vertex*).
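
To see the decay in isolation, here is a minimal standalone sketch (the vertex type of three floats and the count of three are assumptions chosen to match the 36 bytes mentioned in the comments, not code from the question):

#include <iostream>

struct vertex { float x, y, z; };

void printSize(vertex vertices[]) {
    // Inside the function, the array parameter is really just a pointer,
    // so sizeof reports the pointer size, not the array size.
    std::cout << sizeof(vertices) << '\n';   // sizeof(vertex*), typically 8 on a 64-bit build
}

int main() {
    vertex vertices[3];
    std::cout << sizeof(vertices) << '\n';   // 3 * sizeof(vertex) = 36
    printSize(vertices);
}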

Try passing the size of the array alongside it like so:

GLuint initVertexArray(vertex vertices[], const unsigned int size);

Then you would use it like so:

glBufferData(GL_ARRAY_BUFFER, size, vertices, GL_STATIC_DRAW);

instead of:

glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

You would then call it in the same scope as you declared your vertices array:

vertex vertices[100];
// sizeof(vertices) here will give the actual size of the vertices array
// eg: sizeof(vertex) * 100 instead of just giving sizeof(vertex*)
GLuint vertexArrayBuffer = initVertexArray(vertices, sizeof(vertices));
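
Putting it together, the refactored function might look roughly like this (a sketch only; it keeps the glGenVertexArrays/glBindBuffer calls exactly as in the question and only swaps the size argument):

GLuint initVertexArray(vertex vertices[], const unsigned int size) {
    GLuint vertexArrayBuffer;
    glGenVertexArrays(1, &vertexArrayBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexArrayBuffer);
    // Upload exactly 'size' bytes as reported by the caller;
    // sizeof(vertices) here would only be sizeof(vertex*)
    glBufferData(GL_ARRAY_BUFFER, size, vertices, GL_STATIC_DRAW);
    glBindVertexArray(0);

    return vertexArrayBuffer;
}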
Mohamad Elghawi
  • Hardcoding 36 as the size fixed the problem, it seems, even though sizeof(vertices) returns 36. – Scholar Oct 01 '15 at 22:10
  • That's because of what I mentioned above in my answer. – Mohamad Elghawi Oct 01 '15 at 22:12
  • Did you check it within that function? I would be very surprised if it returned 36. – Mohamad Elghawi Oct 01 '15 at 22:13
  • Calling it from main using initVertexArray(vertices, (sizeof(vertices))); works. You are right though, the sizeof operation yielded false results inside the function; that is the problem. – Scholar Oct 01 '15 at 22:15
  • 1
    As I said - the reason for that is that vertex vertices[] is the same as vertex* vertices and so sizeof(vertices) will return sizeof(vertex*) and not the actual size of the array. That's why you always need to pass the size of the array around with it or you can use std::vector which knows about it's size. – Mohamad Elghawi Oct 01 '15 at 22:20
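
Following up on the std::vector suggestion in that last comment, a vector-based variant might look roughly like this (a sketch, not from the original thread; it mirrors the question's setup code):

#include <vector>

GLuint initVertexArray(const std::vector<vertex>& vertices) {
    GLuint vertexArrayBuffer;
    glGenVertexArrays(1, &vertexArrayBuffer);
    glBindBuffer(GL_ARRAY_BUFFER, vertexArrayBuffer);
    // The vector knows its own element count, so the byte size
    // can be computed here instead of being passed in
    glBufferData(GL_ARRAY_BUFFER, vertices.size() * sizeof(vertex),
                 vertices.data(), GL_STATIC_DRAW);
    glBindVertexArray(0);

    return vertexArrayBuffer;
}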