
I'm trying to teach myself OpenGL, so I bought a book about it and tried the example code from the first chapter, but something went wrong. At line 17 (glGenVertexArrays(NumVAOs, VAOs);) and line 18 (glBindVertexArray(VAOs[Triangles]);), VS 2013 Ultimate reports an error, exactly: "Unhandled exception at 0x77350309 in openGL_3.exe: 0xC0000005: Access violation executing location 0x00000000.". So I think something is wrong with memory, but I do not know what. Can someone help me?

#include <iostream>
using namespace std;

#include <vgl.h>
#include <LoadShaders.h>


enum VAO_IDs { Triangles, NumVAOs };
enum Buffer_IDs { ArrayBuffer, NumBuffers };
enum Attrib_IDs { vPosition = 0 };
GLuint VAOs[NumVAOs];
GLuint Buffers[NumBuffers];
const GLuint NumVertices = 6;

void init(void)
{
    glGenVertexArrays(NumVAOs, VAOs);
    glBindVertexArray(VAOs[Triangles]);
    GLfloat vertices[NumVertices][2] = {
        { -0.90, -0.90 }, // Triangle 1
        { 0.85, -0.90 },
        { -0.90, 0.85 },
        { 0.90, -0.85 }, // Triangle 2
        { 0.90, 0.90 },
        { -0.85, 0.90 }
    };
    glGenBuffers(NumBuffers, Buffers);
    glBindBuffer(GL_ARRAY_BUFFER, Buffers[ArrayBuffer]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices),
    vertices, GL_STATIC_DRAW);
    ShaderInfo shaders[] = {
        { GL_VERTEX_SHADER, "triangles.vert" },
        { GL_FRAGMENT_SHADER, "triangles.frag" },
        { GL_NONE, NULL }
    };
    GLuint program = LoadShaders(shaders);
    glUseProgram(program);
    glVertexAttribPointer(vPosition, 2, GL_FLOAT,
    GL_FALSE, 0, BUFFER_OFFSET(0));
    glEnableVertexAttribArray(vPosition);
}

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBindVertexArray(VAOs[Triangles]);
    glDrawArrays(GL_TRIANGLES, 0, NumVertices);
    glFlush();
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA);
    glutInitWindowSize(512, 512);
    glutInitContextVersion(4, 3);
    glutInitContextProfile(GLUT_CORE_PROFILE);
    glutCreateWindow(argv[0]);
    if (glewInit()) {
        cerr << "Unable to initialize GLEW ... exiting" << endl;
        exit(EXIT_FAILURE);
    }
    init();
    glutDisplayFunc(display);
    glutMainLoop();
}
JAMI

3 Answers


You can try setting:

glewExperimental = GL_TRUE;

before your call to glewInit(). Sources:

https://stackoverflow.com/a/20822876/833188

https://stackoverflow.com/a/22820845/833188
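For reference, a minimal sketch of how the GLEW part of the OP's main() could look with that change (an error check is added; the rest of the setup is assumed unchanged):

    glutCreateWindow(argv[0]);

    glewExperimental = GL_TRUE;   // ask GLEW to load entry points even in a core profile
    GLenum glewStatus = glewInit();
    if (glewStatus != GLEW_OK) {
        cerr << "Unable to initialize GLEW: " << glewGetErrorString(glewStatus) << endl;
        exit(EXIT_FAILURE);
    }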

Sga

Are you sure that you can request version 4.3 before the call to glewInit()? Every version >= 3.0 requires WGL_ARB_create_context (Windows) / GLX_ARB_create_context (Linux), which is an extension.

Usually, to create a "modern" OpenGL context (3.0+), it is required to (see the sketch after this list):

  1. Create a temporary context and make it current.
  2. Initialize extensions (either manually or with a loader like GLEW or GLFW).
  3. Request the desired version (and profile, if you create version 3.2 or higher and WGL_ARB_create_context_profile/GLX_ARB_create_context_profile is present).
  4. Delete the temporary context from step 1 and bind your new context.
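
For illustration, here is a rough Windows (WGL) sketch of those four steps. The helper name createModernContext is made up for this example; it assumes an existing window whose device context hdc already has a pixel format set, and all error handling is omitted for brevity:

    #include <windows.h>
    #include <GL/gl.h>
    #include <GL/wglext.h>   // WGL_CONTEXT_* tokens, PFNWGLCREATECONTEXTATTRIBSARBPROC

    HGLRC createModernContext(HDC hdc)
    {
        // 1. Create a temporary legacy context and make it current.
        HGLRC tempCtx = wglCreateContext(hdc);
        wglMakeCurrent(hdc, tempCtx);

        // 2. With a context current, extension entry points can be queried.
        PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
            (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
        if (wglCreateContextAttribsARB == NULL)
        {
            // WGL_ARB_create_context is not supported: only a legacy context is possible.
            wglMakeCurrent(NULL, NULL);
            wglDeleteContext(tempCtx);
            return NULL;
        }

        // 3. Request the desired version and profile.
        const int attribs[] = {
            WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
            WGL_CONTEXT_MINOR_VERSION_ARB, 3,
            WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
            0
        };
        HGLRC modernCtx = wglCreateContextAttribsARB(hdc, NULL, attribs);

        // 4. Delete the temporary context and bind the new one.
        wglMakeCurrent(NULL, NULL);
        wglDeleteContext(tempCtx);
        wglMakeCurrent(hdc, modernCtx);
        return modernCtx;
    }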


I do not know what the connection is between requesting a context version in GLUT and GLEW initialization (I often use GLEW, but context creation is something I always do manually, with the platform-specific API), but obviously the pointers to the new OpenGL API are not initialized when you call glGenVertexArrays. That is the reason for your error: you are calling a function through a pointer that is NULL.
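
To confirm that this is what is happening, you can check the pointer before using it (a sketch; with GLEW, glGenVertexArrays is a macro that expands to a function pointer, so the comparison compiles):

    // Right after glewInit(): verify that the VAO entry point was actually loaded.
    if (glGenVertexArrays == NULL) {
        cerr << "glGenVertexArrays was not loaded - no GL 3.0+ context?" << endl;
        exit(EXIT_FAILURE);
    }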

Mateusz Grzejek
  • freeglut (and the whole `glutInitContextVersion` API is just a freeglut-specific extension not present in other GLUT implementations) will handle the context creation in a platform-abstracted manner, just like GLFW, SDL or similar libs. There is actually no principal problem with that approach, besides trying to use `glewInit()` without `glewExperimental=GL_TRUE` in a core profile. – derhass May 05 '15 at 19:41

I generally agree with Sga's answer recommending setting glewExperimental = GL_TRUE before calling glewInit(). GLEW will fail to initialize in a core profile if this option is not set.

However, the fact that glewInit() does not fail implies that the OP did not get a core profile at all (or that GLEW has finally been fixed, but that is more of a theoretical possibility).

I already had a look into freeglut's implementation of the (freeglut-specific) glutInitContextVersion API for the question "Where is the documentation for glutInitContextVersion?", and the conclusions from back then might be helpful in this case. To quote myself:

From looking into the code (for the current stable version 2.8.1), one will see that freeglut implements the following logic: if the implementation cannot fulfill the version constraints but does support the ARB_create_context extension, it will generate an error and no context will be created. However, if a version is requested but the implementation does not even support the relevant extensions, a legacy GL context is created, effectively ignoring the version request completely.

So from the reported behavior, I deduce that the OP only got a legacy context, possibly even Microsoft's default OpenGL 1.1 implementation.

This also explains why glGenVertexArrays() is a NULL pointer even after glewInit() succeeded: the extension is not supported by this context.

You should check what glGetString(GL_VERSION) and glGetString(GL_RENDERER) actually return right after the glutCreateWindow() call. And depending on the output, you might consider checking/updating your graphics drivers, or you might learn that your GPU is just not capable of modern GL at all.
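
For example (a sketch; glGetString() is an OpenGL 1.1 entry point, so it is available even without GLEW):

    glutCreateWindow(argv[0]);
    cout << "GL_VERSION : " << glGetString(GL_VERSION)  << endl;
    cout << "GL_RENDERER: " << glGetString(GL_RENDERER) << endl;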

derhass