Note: the below only applies to 10.9 (and possibly 10.8; I haven't tested). On newer OS X versions, the only properly supported way to draw OpenGL content is CAOpenGLLayer (NSOpenGLView technically still works, but has bugs: vsync not working, rendering glitches, and so on).
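(For reference, on those newer versions the CAOpenGLLayer route looks roughly like the sketch below. MyCALayer is just a name I made up; the override point itself is the documented one.)

@interface MyCALayer : CAOpenGLLayer
@end

@implementation MyCALayer
- (void)drawInCGLContext:(CGLContextObj)ctx pixelFormat:(CGLPixelFormatObj)pf
            forLayerTime:(CFTimeInterval)t displayTime:(const CVTimeStamp *)ts {
    CGLSetCurrentContext(ctx);
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    // ...GL drawing goes here...
    // super's implementation flushes the context for us
    [super drawInCGLContext:ctx pixelFormat:pf forLayerTime:t displayTime:ts];
}
@end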
With the above aside, it seems @Amber Dixon was right on the money: this is an issue with layer-backed OpenGL views on 10.9 (and possibly 10.8). There are a few ways to create them, some more bugged than others, so let's go through them.
Below is a snippet of a simple OpenGL example that draws a triangle by subclassing NSOpenGLView. You will note that it is layer-backed, and on 10.9.5 it works exactly as it should.
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl3.h>

@interface MyOpenGLView : NSOpenGLView
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (void)prepareOpenGL {
    // Set up the OpenGL context and other properties
    [super prepareOpenGL];
    [[self openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
    // Clear the view with the clear color
    glClear(GL_COLOR_BUFFER_BIT);

    // Set up the vertex data for the triangle
    GLfloat vertices[] = {
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f
    };

    // Set up the vertex buffer object (VBO)
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

    // Set up the vertex shader
    const GLchar* vertexSource =
        "#version 150\n"
        "in vec3 position;"
        "void main() {"
        "    gl_Position = vec4(position, 1.0);"
        "}";
    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexSource, NULL);
    glCompileShader(vertexShader);

    // Set up the fragment shader
    const GLchar* fragmentSource =
        "#version 150\n"
        "out vec4 outColor;"
        "void main() {"
        "    outColor = vec4(1.0, 1.0, 1.0, 1.0);"
        "}";
    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
    glCompileShader(fragmentShader);

    // Set up the shader program; bind attribute 0 before linking so the
    // glVertexAttribPointer(0, ...) call below is guaranteed to match
    GLuint shaderProgram = glCreateProgram();
    glAttachShader(shaderProgram, vertexShader);
    glAttachShader(shaderProgram, fragmentShader);
    glBindAttribLocation(shaderProgram, 0, "position");
    glBindFragDataLocation(shaderProgram, 0, "outColor");
    glLinkProgram(shaderProgram);
    glUseProgram(shaderProgram);

    // Set up the vertex array object (VAO)
    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, NULL);

    // Draw the triangle
    glDrawArrays(GL_TRIANGLES, 0, 3);

    // Clean up
    glDeleteProgram(shaderProgram);
    glDeleteShader(fragmentShader);
    glDeleteShader(vertexShader);
    glDeleteBuffers(1, &vbo);
    glDeleteVertexArrays(1, &vao);

    // Swap the buffers to display the rendered image
    [[self openGLContext] flushBuffer];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
                                                       styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
                                                         backing:NSBackingStoreBuffered
                                                           defer:NO];

        // Create an NSOpenGLPixelFormat with double buffering and a 3.2 Core profile
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];

        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];

        // Set the pixel format and context for the view
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400) pixelFormat:pixelFormat];
        [glView setOpenGLContext:glContext];

        // Make the view layer-backed (NSOpenGLView supplies its own NSOpenGLLayer)
        [glView setWantsLayer:YES];
        [glView.layer setBackgroundColor:[NSColor blueColor].CGColor];

        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];
    }
    return 0;
}
However, there are some interesting points here. First, notice that NSOpenGLView has a setOpenGLContext: method as well as a way to set the pixel format (via setPixelFormat: or in the constructor). You might wonder why the latter is even necessary, since the pixel format can be derived from the context. It turns out this is a subtle pitfall. If you change the view-creation line to the following,
MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];
now not passing in the pixelFormat (but still explicitly setting openGLContext afterwards), you'll find that it fails with setWantsLayer: YES but works fine with setWantsLayer: NO.
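(As an aside, "derived" above is meant literally: you can recover a context's pixel format through CGL, which is the same trick used in the last example further down. A sketch, assuming the glContext from the example above:)

// Recover a pixel format from an existing context
NSOpenGLPixelFormat *derived = [[NSOpenGLPixelFormat alloc]
    initWithCGLPixelFormatObj:CGLGetPixelFormat([glContext CGLContextObj])];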
It turns out that when you request layer backing, NSOpenGLView actually creates a brand new context from the pixel format (falling back to a default pixel format if none was set), rather than obeying the one you passed in. This can be seen in the disassembly:
void * -[NSOpenGLView openGLContext](void * self, void * _cmd) {
    rbx = self;
    if ([self _isLayerBacked] != 0x0) {
        rax = [rbx _layerBackedOpenGLContext];
    }
    else {
        rax = rbx->_openGLContext;
        if (rax == 0x0) {
            r14 = [NSOpenGLContext alloc];
            rax = rbx->_pixelFormat;
            if (rax == 0x0) {
                rax = [rbx class];
                rax = [rax defaultPixelFormat];
            }
            objc_assign_ivar([r14 initWithFormat:rax shareContext:0x0], rbx, *_OBJC_IVAR_$_NSOpenGLView._openGLContext);
            [rbx->_openGLContext makeCurrentContext];
            rax = rbx->_openGLContext;
        }
    }
    return rax;
}
That is, a view has a separate _layerBackedOpenGLContext and _surfaceBackedOpenGLContext, and whenever setWantsLayer is toggled, a brand new context is created and assigned.
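You can see this for yourself with a quick check in main from the first example (a sketch; the log just prints the two pointers):

NSOpenGLContext *before = [glView openGLContext];
[glView setWantsLayer:YES];
NSOpenGLContext *after = [glView openGLContext];
// On 10.9 these should be two distinct objects: toggling layer backing
// swapped in a freshly created context behind your back
NSLog(@"before: %p, after: %p", before, after);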
Another issue shows up in the following snippet. Instead of subclassing NSOpenGLView, let's subclass NSView directly and bind the NSOpenGLContext ourselves. Nothing in Apple's documentation indicates that NSOpenGLView is anything "special" beyond being a convenience wrapper around NSView that takes care of initializing and managing the context, so you'd expect this to be straightforward:
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl3.h>

@interface MyOpenGLView : NSView
@property (nonatomic, strong) NSOpenGLContext *openGLContext;
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (void)prepareOpenGL {
    // Set up the OpenGL context and other properties
    // (note: unlike NSOpenGLView, plain NSView never calls this for us)
    [[self openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
    [[self openGLContext] setView:self];
    [[self openGLContext] makeCurrentContext];

    // Same calls as before, elided for brevity

    // Swap the buffers to display the rendered image
    [[self openGLContext] flushBuffer];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
                                                       styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
                                                         backing:NSBackingStoreBuffered
                                                           defer:NO];

        // Create an NSOpenGLPixelFormat with double buffering and a 3.2 Core profile
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];

        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];

        // Bind the context to the view and make the view layer-backed
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];
        [glView setOpenGLContext:glContext];
        [glView setWantsLayer:YES];
        [glView.layer setBackgroundColor:[NSColor blueColor].CGColor];

        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];
    }
    return 0;
}
However, when you run this on 10.9, you'll find that it does not work at all, yet setting setWantsLayer: NO does work! Even more interestingly, it runs fine on newer OS X versions, so the code itself probably isn't wrong. And when you print out the value of [self _layerBackedOpenGLContext] from drawRect:, you find that the context is indeed being set correctly. So what gives?
Comparing the results of [self layer] in both examples gives a hint as to the answer: when we subclass NSOpenGLView, the layer is set to an NSOpenGLLayer, but in the latter example, the layer remains the default NSView layer. (It might also interest you to note that on newer OS X versions, the backing layer for both is an NSCGLSurface, as described here.) So it seems that, either intentionally or by mistake, when you create your own layer-backed NSView with a bound NSOpenGLContext, the context is never actually bound to a layer, which would explain the error you saw.
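(If you want to check this on your own machine, a one-liner after the view is set up does it; a sketch:)

// Subclassing NSOpenGLView: this should log NSOpenGLLayer.
// Subclassing NSView directly: it logs the default backing-layer class instead.
NSLog(@"layer class: %@", [glView.layer class]);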
Edit: It turns out that if you want to reimplement a layer-backed NSOpenGLView yourself and do it properly, you're responsible for providing the backing CALayer via makeBackingLayer (as mentioned, it works even if you don't do this on 10.10+, but that's undefined behavior). Moreover, you cannot set your own context; you have to use the context the layer gives you.
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/OpenGL.h>
#import <OpenGL/gl3.h>

@interface MyLayer : NSOpenGLLayer
@property (atomic, strong) NSOpenGLContext *myContext;
@end

@implementation MyLayer

- (BOOL)canDrawInCGLContext:(CGLContextObj)ctx pixelFormat:(CGLPixelFormatObj)pf
               forLayerTime:(CFTimeInterval)t displayTime:(const CVTimeStamp *)ts
{
    return YES;
}

- (NSOpenGLPixelFormat *)openGLPixelFormatForDisplayMask:(uint32_t)mask {
    return [self openGLPixelFormat];
}

- (NSOpenGLPixelFormat *)openGLPixelFormat {
    // Derive the pixel format from the context we were handed
    return [[NSOpenGLPixelFormat alloc] initWithCGLPixelFormatObj:CGLGetPixelFormat([self.myContext CGLContextObj])];
}

@end

@interface MyOpenGLView : NSView
@property (atomic, strong) MyLayer *myLayer;
@end

@implementation MyOpenGLView

+ (BOOL)canDrawOpenGL {
    return YES;
}

- (MyLayer *)makeBackingLayer {
    // Provide the backing layer ourselves instead of relying on the default
    self.myLayer = [[MyLayer alloc] init];
    self.myLayer.view = self;
    self.myLayer.asynchronous = NO;
    return self.myLayer;
}

- (void)prepareOpenGL {
    // Set up the OpenGL context and other properties
    // (as before, plain NSView never calls this for us)
    [[self.myLayer openGLContext] makeCurrentContext];
    glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
}

- (void)drawRect:(NSRect)dirtyRect {
    [[self.myLayer openGLContext] setView:self];
    [[self.myLayer openGLContext] makeCurrentContext];

    // Debug logging: compare the layer's context with the one we created
    NSLog(@"%@", self.myLayer);
    NSLog(@"%@", self.layer);
    NSLog(@"%@", [self.myLayer openGLContext]);
    NSLog(@"%@", [self.myLayer myContext]);

    // <snip triangle>

    glFlush();
    // Swap the buffers to display the rendered image
    [[self.myLayer openGLContext] flushBuffer];

    // Crude render loop: mark both the layer and the view dirty again
    [self.myLayer setNeedsDisplay];
    [self setNeedsDisplay:YES];
}

@end

int main(int argc, const char * argv[]) {
    // Set up the application and window
    @autoreleasepool {
        NSApplication *app = [NSApplication sharedApplication];
        NSWindow *window = [[NSWindow alloc] initWithContentRect:NSMakeRect(0, 0, 400, 400)
                                                       styleMask:NSTitledWindowMask | NSClosableWindowMask | NSMiniaturizableWindowMask
                                                         backing:NSBackingStoreBuffered
                                                           defer:NO];

        // Create an NSOpenGLPixelFormat with double buffering and a 3.2 Core profile
        NSOpenGLPixelFormatAttribute attribs[] = {
            NSOpenGLPFADoubleBuffer,
            NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
            0
        };
        NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs];

        // Create an NSOpenGLContext object with the pixel format
        NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
        NSLog(@"%@", glContext);

        // Make the view layer-backed (this triggers makeBackingLayer), then hand the layer our context
        MyOpenGLView *glView = [[MyOpenGLView alloc] initWithFrame:NSMakeRect(0, 0, 400, 400)];
        [glView setWantsLayer:YES];
        glView.myLayer.myContext = glContext;
        [glView.myLayer setBackgroundColor:[NSColor blueColor].CGColor];

        // Set up the window and run the application
        [window setContentView:glView];
        [window makeFirstResponder:glView];
        [window makeKeyAndOrderFront:nil];
        [app run];
    }
    return 0;
}
Note: it's also apparently the case that when layer-backing is enabled, the draw FBO is not always 0, as it is with a non-layer-backed view. So if you need to draw into a specific FBO and later return to the default one, don't assume it is 0; query the current binding first:
GLint i = 0;
glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &i);  // i now holds the layer's draw FBO
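Put together, rendering into your own framebuffer and getting back to the layer's looks roughly like this sketch (myOffscreenFBO is a hypothetical FBO created elsewhere):

GLint layerFBO = 0;
glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &layerFBO);  // save the layer's FBO
glBindFramebuffer(GL_FRAMEBUFFER, myOffscreenFBO);      // hypothetical offscreen FBO
// ...offscreen rendering...
glBindFramebuffer(GL_FRAMEBUFFER, layerFBO);            // restore the layer's FBO, not 0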
I also did some snooping, and the restriction that you cannot pass in your own context is inherited from CAOpenGLLayer. If you need access to the context while setting up the views, you will notice that [layer openGLContext] is nil until CAOpenGLLayerDraw is called. This is triggered by [layer draw], so you can manually call that once the view is properly set up (i.e. bound to a window), after which you can freely access the context.
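In other words, something along these lines (a sketch; as far as I can tell, draw is not public API, so the usual private-API caveats apply):

[window setContentView:glView];                  // the layer needs the view in a window first
[glView.myLayer draw];                           // private: kicks off CAOpenGLLayerDraw
NSLog(@"%@", [glView.myLayer openGLContext]);    // now non-nil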