0

I'm upgrading an OpenGL / Qt application to OS X Lion and hitting some new errors that I need help solving. I'm inexplicably getting GL_FRAMEBUFFER_UNDEFINED on glClear.

I've read all the possible causes of this, and nothing seems to match. To help track the problem down, I added the following two lines to some common check code we use:

glCheckFramebufferStatus(GL_FRAMEBUFFER);
glCheckFramebufferStatusEXT(GL_FRAMEBUFFER);

Here's the OpenGL Trace, first the beginning of the application:

1: 0x01021b06 glGenBuffers(1, 0x11c461c0); 
2: 0x01021b06 glGenBuffers(1, 0x11c4616c);  
3: 0x01021b06 glGenBuffers(1, 0x11c46118); 
4: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
5: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
6: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
7: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
8: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
9: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
10: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
11: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED

As you can see, BOTH calls end up as glCheckFramebufferStatusEXT, which I assume is something OS X is doing. I'm not sure why there's no framebuffer, ever, though.
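
For reference, GL_FRAMEBUFFER_UNDEFINED is only returned when the bound framebuffer is the default framebuffer and the default framebuffer does not exist, so a check like this (a sketch; standard GL, nothing OS X specific) shows whether framebuffer 0 is what's being tested:

GLint boundFBO = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &boundFBO);
// boundFBO == 0 means the default (window-system) framebuffer is bound;
// GL_FRAMEBUFFER_UNDEFINED on framebuffer 0 means the default framebuffer
// does not exist, i.e. the context has no drawable attached.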

Now, here's the trace at the time of the error:

29842: 0x01021b06 glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_ONE); 
29843: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29844: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29845: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29846: 0x01021b06 glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 32, 32, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, 0x11c3d010); 
29847: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29848: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29849: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29850: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29851: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29852: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29853: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29854: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29855: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29856: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29857: 0x01021b06 glClearColor(0, 0, 0, 1); 
29858: 0x01021b06 glGetError(); returns: GL_NO_ERROR  
29859: 0x01021b06 glCheckFramebufferStatusEXT(GL_FRAMEBUFFER); returns: GL_FRAMEBUFFER_UNDEFINED  
29860: 0x01021b06 glClear(GL_COLOR_BUFFER_BIT); 
29861: 0x01021b06 glGetError(); returns: GL_INVALID_FRAMEBUFFER_OPERATION

As you can see, we get GL_NO_ERROR right up to the glClear, which fails.

I'm not sure how to resolve this problem. What information should I be gathering to track it down?

genpfault
Michael Wilson
  • Are you attempting to use a frame buffer here? – Robinson Apr 03 '12 at 06:09
  • Have you bound the framebuffer somewhere? `glGenBuffers` generates names for index and vertex buffers, not for framebuffers – crazyjul Apr 03 '12 at 15:06
  • Prior to OS X Lion, the code "just worked", so my assumption has been that a framebuffer was automatically created and bound. That may be a false assumption. The other thing that may be important is that we've gone from Carbon-backed Qt to Cocoa-backed Qt, since Carbon-backed Qt is no longer supported. We may need to do some additional setup because of that. – Michael Wilson Apr 03 '12 at 23:33
  • I've modified the code to include the following right after aglCreateContext: `GLuint fb; glGenFramebuffers(1, &fb); glBindFramebuffer(GL_FRAMEBUFFER, fb);` It still fails in exactly the same way. – Michael Wilson Apr 08 '12 at 06:03
  • Did you figure out how to make this work? Having the same problem here. – V1ru8 Apr 18 '12 at 08:40
  • Partially. The issue is that there's no drawable bound to the AGL context. So I have switched to Cocoa (a CGL context), wrapping it in an NSOpenGLContext and using setView: to bind an NSView to it: `NSOpenGLContext *nsCtx = [[NSOpenGLContext alloc] initWithCGLContextObj:ctx]; [nsCtx retain]; [nsCtx setView:(NSView *)view];` This eliminated the errors. The drawing isn't showing (yet), but it's better than it was. – Michael Wilson Apr 28 '12 at 04:53
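
A minimal sketch of the setup described in the last comment, assuming ctx is an existing CGLContextObj and view is the NSView to draw into (both come from the surrounding application code):

#import <Cocoa/Cocoa.h>

// Wrap the existing CGL context in an NSOpenGLContext and give it a drawable.
NSOpenGLContext *nsCtx = [[NSOpenGLContext alloc] initWithCGLContextObj:ctx];
[nsCtx retain];                    // manual reference counting, as in the comment
[nsCtx setView:(NSView *)view];    // attach the drawable; framebuffer 0 now exists
[nsCtx makeCurrentContext];        // added for completeness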

5 Answers

3

I was seeing this bug as well when using NSOpenGLView on OS X 10.8, but I found a workaround. If glCheckFramebufferStatus is returning GL_FRAMEBUFFER_UNDEFINED, the framebuffer has not been instantiated. Also make sure you are calling [[self openGLContext] setView:self], so that the context has a drawable region associated with it.

My discovery was that if any of the NSOpenGLView's ancestors either (1) have layers (i.e. are layer-backed or layer-hosted via setWantsLayer) or (2) were initialized with Interface Builder, then the NSOpenGLView framebuffer won't get initialized! I had to modify my OpenGL view and its ancestors to not use layers and not be instantiated by Interface Builder in order to get my programmatically instantiated NSOpenGLView to work. (By programmatically instantiated, I mean that the view was created using [[NSOpenGLView alloc] init...].)
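
A minimal sketch of that setup, assuming window already exists and with illustrative pixel-format attributes:

// Programmatically created NSOpenGLView with no layer backing anywhere in
// the ancestor chain, per the workaround above.
NSOpenGLPixelFormatAttribute attrs[] = { NSOpenGLPFADoubleBuffer, 0 };
NSOpenGLPixelFormat *pf = [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];
NSOpenGLView *glView = [[NSOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)
                                               pixelFormat:pf];
[window setContentView:glView];            // window is assumed to exist already
[[glView openGLContext] setView:glView];   // ensure the context has a drawable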

Amber Haq Dixon
2

You're absolutely right. Apple's GLFullScreen sample gets a GL error right after glClear too. It happens once at startup, and once again when flipping from windowed to full screen.

Aitul
1

This seems to be caused by an issue with the GL implementation shipped with Mac OS X. Apparently the framebuffer isn't ready for use at the time of the glClear call.

http://lists.apple.com/archives/mac-opengl/2012/Jul/msg00038.html

It's been filed as a bug on the Apple bug tracker, radar #11974524.

I've found that everything works correctly if you call glGetError() once after the glClear; this resets the OpenGL error flag.
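
In code, the workaround amounts to something like this (a sketch):

glClear(GL_COLOR_BUFFER_BIT);
// Work around the OS X bug (radar 11974524): the first glClear can raise
// GL_INVALID_FRAMEBUFFER_OPERATION even though rendering then proceeds
// normally. Reading the error once resets the GL error flag.
GLenum err = glGetError();
if (err == GL_INVALID_FRAMEBUFFER_OPERATION) {
    // Spurious on the first frame; safe to ignore per the note above.
}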

JS.
1

Note: the below only applies to 10.9 (and maybe 10.8; I haven't tested). On newer OS X versions, the only really supported way to draw OpenGL content is via CAOpenGLLayer (NSOpenGLView technically still works, but has bugs such as vsync not working, render glitches, etc.).
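
For reference, the CAOpenGLLayer route looks roughly like this (a sketch; the subclass name and clear color are illustrative):

#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl.h>

@interface MyGLLayer : CAOpenGLLayer
@end

@implementation MyGLLayer
// CAOpenGLLayer supplies the context and pixel format; we only draw.
- (void)drawInCGLContext:(CGLContextObj)ctx pixelFormat:(CGLPixelFormatObj)pf
            forLayerTime:(CFTimeInterval)t displayTime:(const CVTimeStamp *)ts
{
    glClearColor(0.0f, 0.0f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    // Let the superclass flush the context for us.
    [super drawInCGLContext:ctx pixelFormat:pf forLayerTime:t displayTime:ts];
}
@end

Attach it by assigning view.layer = [MyGLLayer layer] before setting view.wantsLayer = YES (the layer-hosting order).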

With that aside, it seems @Amber Dixon was right on the money about this being an issue with layer-backed OpenGL views on 10.9 (and possibly 10.8). There are a few ways to create them, some buggier than others, so let's go through them.

Below is a snippet of a simple OpenGL example that draws a triangle by subclassing NSOpenGLView. Note that it is layer-backed, and on 10.9.5 it works exactly as it should.

#import <Cocoa/Cocoa.h>
#import <OpenGL/gl.h>
#import <OpenGL/glu.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl3.h>

@interface MyOpenGLView : NSOpenGLView
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (void)prepareOpenGL {
    // Set up OpenGL context and other properties
    [super prepareOpenGL];
    [[self openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
    // Clear the view with the clear color
    glClear(GL_COLOR_BUFFER_BIT);
    
    // Set up the vertex data for the triangle
    GLfloat vertices[] = {
        -0.5f, -0.5f, 0.0f,
        0.5f, -0.5f, 0.0f,
        0.0f,  0.5f, 0.0f
    };
    
    // Set up the vertex buffer object (VBO)
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    
    // Set up the vertex shader
    const GLchar* vertexSource =
    "#version 150\n"
    "in vec3 position;"
    "void main() {"
    "   gl_Position = vec4(position, 1.0);"
    "}";
    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexSource, NULL);
    glCompileShader(vertexShader);
    
    // Set up the fragment shader
    const GLchar* fragmentSource =
    "#version 150\n"
    "out vec4 outColor;"
    "void main() {"
    "   outColor = vec4(1.0, 1.0, 1.0, 1.0);"
    "}";
    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
    glCompileShader(fragmentShader);
    
    // Set up the shader program
    GLuint shaderProgram = glCreateProgram();
    glAttachShader(shaderProgram, vertexShader);
    glAttachShader(shaderProgram, fragmentShader);
    glBindFragDataLocation(shaderProgram, 0, "outColor");
    glLinkProgram(shaderProgram);
    glUseProgram(shaderProgram);
    
    // Set up the vertex array object (VAO)
    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, NULL);
    
    // Draw the triangle
    glDrawArrays(GL_TRIANGLES, 0, 3);
    
    // Clean up
    glDeleteProgram(shaderProgram);
    glDeleteShader(fragmentShader);
    glDeleteShader(vertexShader);
    glDeleteBuffers(1, &vbo);
    glDeleteVertexArrays(1, &vao);
    
    // Swap the buffers to display the rendered image
    [[self openGLContext] flushBuffer];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
            styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
            backing:NSBackingStoreBuffered
            defer:NO];
        
        // The backing store type was already specified in the initializer above
        [window setBackingType:NSBackingStoreBuffered];
        
        // Create an NSOpenGLPixelFormat object with double buffering enabled
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];
        
        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
        
        // Set the pixel format and context for the view
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400) pixelFormat:pixelFormat];
        [glView setOpenGLContext:glContext];
        
        // Make the view layer-backed; NSOpenGLView supplies its own
        // NSOpenGLLayer when layer backing is enabled
        [glView setWantsLayer: YES];
        [glView.layer setBackgroundColor:[NSColor blueColor].CGColor];
        
        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];

        
    }
    return 0;
}

However, there are some interesting points to note. First, note that NSOpenGLView has a setOpenGLContext: method as well as a way to set the pixel format (via setPixelFormat: or in the constructor). You might wonder why the latter is even necessary, since the pixel format can be derived from the context. In fact, this turns out to be a subtle pitfall. If you change the line to the following,

MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];

now not passing in the pixelFormat (but still explicitly setting openGLContext later), you'll find that it fails with setWantsLayer: YES but works fine with setWantsLayer: NO.

It turns out that when you request a layer-backed view, NSOpenGLView actually creates a brand-new context from the pixel format (using the default pixel format if none was set), rather than obeying the one you passed in. This can be seen from the disassembly:

void * -[NSOpenGLView openGLContext](void * self, void * _cmd) {
    rbx = self;
    if ([self _isLayerBacked] != 0x0) {
            rax = [rbx _layerBackedOpenGLContext];
    }
    else {
            rax = rbx->_openGLContext;
            if (rax == 0x0) {
                    r14 = [NSOpenGLContext alloc];
                    rax = rbx->_pixelFormat;
                    if (rax == 0x0) {
                            rax = [rbx class];
                            rax = [rax defaultPixelFormat];
                    }
                    objc_assign_ivar([r14 initWithFormat:rax shareContext:0x0], rbx, *_OBJC_IVAR_$_NSOpenGLView._openGLContext);
                    [rbx->_openGLContext makeCurrentContext];
                    rax = rbx->_openGLContext;
            }
    }
    return rax;
}

That is, a view has a separate _layerBackedOpenGLContext and a _surfaceBackedOpenGLContext, and whenever setWantsLayer is toggled, a brand-new context is created and assigned.
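
You can watch the swap happen (a sketch, using the glView from the example above; if the disassembly is right, the two pointers should differ on 10.9):

// Toggling layer backing swaps in a different, internally created context.
NSOpenGLContext *surfaceCtx = [glView openGLContext];
[glView setWantsLayer:YES];
NSOpenGLContext *layerCtx = [glView openGLContext];
NSLog(@"surface-backed: %p, layer-backed: %p", surfaceCtx, layerCtx);
// The layer-backed context is built from the pixel format, not from the
// context handed to setOpenGLContext:.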

Another issue shows up in the following snippet. This time, instead of subclassing NSOpenGLView, let's subclass NSView directly and bind the NSOpenGLContext ourselves. According to Apple, NSOpenGLView is nothing "special", just a convenience wrapper around NSView that takes care of initializing and managing the context, so you'd expect this to be straightforward:

#import <Cocoa/Cocoa.h>
#import <OpenGL/gl.h>
#import <OpenGL/glu.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl3.h>

@interface MyOpenGLView : NSView
@property (nonatomic, strong) NSOpenGLContext *openGLContext;
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (void)prepareOpenGL {
    // Set up OpenGL context and other properties.
    // Note: unlike NSOpenGLView, a plain NSView subclass never gets this
    // called automatically; it is kept here to mirror the first example.

    [[self openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
    
    [[self openGLContext] setView: self];
    [[self openGLContext] makeCurrentContext];
    
    // Same calls as before, elided for brevity
    
    // Swap the buffers to display the rendered image
    [[self openGLContext] flushBuffer];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
            styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
            backing:NSBackingStoreBuffered
            defer:NO];
        
        // The backing store type was already specified in the initializer above
        [window setBackingType:NSBackingStoreBuffered];
        
        // Create an NSOpenGLPixelFormat object with double buffering enabled
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];
        
        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
        
        // Set the pixel format and context for the view
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];
        [glView setOpenGLContext:glContext];
        
    
        [glView setWantsLayer: YES];
        [glView.layer setBackgroundColor:[NSColor blueColor].CGColor];
        
        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];

        
    }
    return 0;
}

However, when you run this (on 10.9), you'll find that it does not work at all, yet setting setWantsLayer: NO does work! Even more interestingly, it runs fine on newer OS X versions, so the code itself probably isn't wrong. And if you print the value of [self _layerBackedOpenGLContext] from drawRect:, you'll find that the context is indeed being set correctly. So what gives?

Comparing the results of [self layer] in the two examples hints at the answer: when we subclass NSOpenGLView, the layer is set to an NSOpenGLLayer, but in the latter case the layer remains the default NSView layer. (It might also interest you that on newer OS X versions the backing layer for both is an NSCGLSurface, as described here.) So it seems that, whether intentionally or by mistake, when you create your own layer-backed NSView with a bound NSOpenGLContext, the context is never actually bound to a layer, which would explain the error you saw.
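
That comparison is easy to reproduce (a sketch, using the glView from either example):

// With the NSOpenGLView subclass the backing layer is an NSOpenGLLayer;
// with the plain NSView + NSOpenGLContext it stays a generic CALayer,
// so the context never gets bound to any layer.
NSLog(@"backing layer: %@", [[glView layer] class]);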


Edit: It turns out that if you want to reimplement a layer-backed NSOpenGLView yourself and do it properly, you're responsible for providing the backing CALayer (as mentioned, it works on 10.10+ even if you don't, but that's undefined behavior). Moreover, you cannot set your own context; you have to use the context the layer gives you.

#import <Cocoa/Cocoa.h>
#import <OpenGL/gl.h>
#import <OpenGL/glu.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl3.h>

@interface MyLayer : NSOpenGLLayer
@property (atomic, strong) NSOpenGLContext *myContext;
@end

@implementation MyLayer
- (BOOL)canDrawInCGLContext:(CGLContextObj)ctx pixelFormat:(CGLPixelFormatObj)pf
        forLayerTime:(CFTimeInterval)t displayTime:(const CVTimeStamp *)ts
{
    return YES;
}
- (NSOpenGLPixelFormat *)openGLPixelFormatForDisplayMask:(uint32_t)mask {
    return [self openGLPixelFormat];
}

-(NSOpenGLPixelFormat*) openGLPixelFormat {
    return [[NSOpenGLPixelFormat alloc] initWithCGLPixelFormatObj: CGLGetPixelFormat([self.myContext CGLContextObj])];
}

@end

@interface MyOpenGLView : NSView
@property (atomic, strong) MyLayer *myLayer;
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (MyLayer *)makeBackingLayer {
    self.myLayer = [[MyLayer alloc] init];
    self.myLayer.view = self;
    self.myLayer.asynchronous = NO;
    return self.myLayer;
}

- (void)prepareOpenGL {
    // Set up OpenGL context and other properties

    [[self.myLayer openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
//  self.layer.openGLContext = self.openGLContext;
    [[self.myLayer openGLContext] setView: self];
    [[self.myLayer openGLContext] makeCurrentContext];
    
    NSLog(@"%@", self.myLayer);
    NSLog(@"%@", self.layer);
    NSLog(@"%@", [self.myLayer openGLContext]);
    NSLog(@"%@", [self.myLayer myContext]);
    // <snip triangle>
    glFlush();
    
    
    // Swap the buffers to display the rendered image
    [[self.myLayer openGLContext] flushBuffer];
    [self.myLayer setNeedsDisplay];
    [self setNeedsDisplay: YES];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
            styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
            backing:NSBackingStoreBuffered
            defer:NO];
        
        // The backing store type was already specified in the initializer above
        [window setBackingType:NSBackingStoreBuffered];
        
        // Create an NSOpenGLPixelFormat object with double buffering enabled
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];
        
        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
        NSLog(@"%@", glContext);
        
        // Set the pixel format and context for the view
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];     
    
        [glView setWantsLayer: YES];
        glView.myLayer.myContext = glContext;
        [glView.myLayer setBackgroundColor:[NSColor blueColor].CGColor];
        
        
        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];

        
    }
    return 0;
}

Note: it's also apparently the case that when layer backing is enabled, the draw FBO is not always 0 the way it is with a non-layer-backed view. So if you need to draw into the view's framebuffer later, don't assume FBO 0; query the current binding:

GLint drawFBO = 0;
glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &drawFBO);  // query rather than assuming 0

I also did some snooping, and the restriction that you cannot pass in your own context is inherited from CAOpenGLLayer. If you need access to the context while setting up the views, note that [layer openGLContext] is nil until CAOpenGLLayer's drawing first fires. This is triggered by [layer draw], so you can call that manually once the view is properly set up (i.e. bound to a window), after which you can freely access the context.
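
Concretely (a sketch, using the glView and myLayer from the example above, and relying on the [layer draw] behavior just described):

// The layer's context only exists after the layer has drawn once, so force
// a draw as soon as the view is bound to a window, then grab the context.
[window setContentView:glView];
[glView.myLayer draw];
NSOpenGLContext *layerCtx = [glView.myLayer openGLContext];
NSLog(@"layer context is now available: %@", layerCtx);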

1110101001
  • See https://stackoverflow.com/questions/7610117/layer-backed-openglview-redraws-only-if-window-is-resized – 1110101001 Aug 18 '23 at 22:04
  • And more info on other ways to render opengl on mac: http://litherum.blogspot.com/2016/08/opengl-on-macos.html – 1110101001 Aug 18 '23 at 23:27
0

I ran into the same problem here. After playing around with Apple's own GLFullScreen sample application, I noticed that the error only manifested itself when using the Mac OS X 10.7 SDK; if I switched to 10.6, it magically vanished.

You can find GLFullScreen here.

You probably need to set up the framebuffer differently in 10.7. I might investigate this if I find that using 10.6 limits me in some way; until then, I'm perfectly happy to ignore this problem.

ForceMagic