
I'm trying to take a screenshot of a GLView on the iPhone. I wrote the following code:

[self setContext];

GLint backWidth, backHeight;

glGetRenderbufferParameterivOES( GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backWidth );
glGetRenderbufferParameterivOES( GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backHeight );

int         dataLength  = backWidth * backHeight * 4;
uint32_t*   pData       = (uint32_t*)malloc( dataLength * sizeof( char ) );
memset( pData, 0xff, dataLength ); // This is here to confirm some writing occurs in glReadPixels

// Read pixel data from the framebuffer
//glPixelStorei( GL_PACK_ALIGNMENT, 4 );
glReadPixels( 0, 0, backWidth, backHeight, GL_RGBA, GL_UNSIGNED_BYTE, pData );
fprintf( stderr, "%d\n", glGetError() );

CGDataProviderRef   cgDataProvider  = CGDataProviderCreateWithData( NULL, pData, dataLength, DataProviderReleaseDataCallback );
CGColorSpaceRef     cgColorSpace    = CGColorSpaceCreateDeviceRGB();
CGImageRef          cgImage         = CGImageCreate(    backWidth, backHeight, 8, 32, backWidth * 4, cgColorSpace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                                        cgDataProvider, NULL, true, kCGRenderingIntentDefault );

//NSData*       pNSData         = [[NSData alloc] initWithBytes: pData length: dataLength];
UIImage*    pRetImage       = [UIImage imageWithCGImage: cgImage scale: 1.0f orientation: UIImageOrientationDownMirrored];
CFRelease( cgDataProvider );
CFRelease( cgColorSpace );
CGImageRelease( cgImage );
//free( pData );

return pRetImage;

This works perfectly in the simulator. Unfortunately, the moment I try to run it on an iPhone 4S, glReadPixels does nothing. I deliberately memset the array to 0xff to confirm whether anything was being written, and no matter what I set the array to, glReadPixels leaves it untouched. It also reports no errors.

I'm not using a multisample buffer, unless the iPhone does that setup for me:

glGenRenderbuffers( 1, &mGlRenderBuffer );
glBindRenderbuffer( GL_RENDERBUFFER, mGlRenderBuffer );

[mGlContext renderbufferStorage: GL_RENDERBUFFER fromDrawable: mGlLayer];

GLuint frameBuffer;
glGenFramebuffers( 1, &frameBuffer );
glBindFramebuffer( GL_FRAMEBUFFER, frameBuffer );
glFramebufferRenderbuffer( GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, mGlRenderBuffer );

Does anyone have any idea what's going on? It's driving me mad :(

Goz

1 Answer


It turns out that iOS 6 is a bit pickier about setting kEAGLDrawablePropertyRetainedBacking ...

eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithBool:YES],
                                    kEAGLDrawablePropertyRetainedBacking,
                                    kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
                                    nil];
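
Worth noting: the drawable properties are read when renderbufferStorage:fromDrawable: is called, so they need to be on the layer before that point. A minimal sketch of the ordering, reusing the identifiers from the question's setup code:

// Set the drawable properties first, then allocate storage from the layer.
mGlLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithBool:YES], kEAGLDrawablePropertyRetainedBacking,
                                   kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat,
                                   nil];

glGenRenderbuffers( 1, &mGlRenderBuffer );
glBindRenderbuffer( GL_RENDERBUFFER, mGlRenderBuffer );

// renderbufferStorage:fromDrawable: reads the layer's drawableProperties here,
// so setting them after this call has no effect.
[mGlContext renderbufferStorage: GL_RENDERBUFFER fromDrawable: mGlLayer];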

See here

Goz
    The reason for this isn't so much that iOS 6 is more picky about retained backing, it's that reading from the framebuffer after it's presented to the screen is an unsupported operation, and iOS 6.0 finally enforces that. Retaining the backing is one way to keep this content around, but doing so may have negative performance consequences. Alternatively, you can use glReadPixels() before -presentRenderbuffer: to grab your rendered image, without the retained backing. – Brad Larson Nov 06 '12 at 19:19
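
For reference, a minimal sketch of the ordering Brad Larson describes, reusing the identifiers from the question (drawScene is a hypothetical stand-in for the app's actual drawing code):

[EAGLContext setCurrentContext: mGlContext];
glBindFramebuffer( GL_FRAMEBUFFER, frameBuffer );

drawScene();    // issue all GL drawing for this frame

// Read the pixels while the framebuffer contents are still defined,
// i.e. before the renderbuffer is presented.
glReadPixels( 0, 0, backWidth, backHeight, GL_RGBA, GL_UNSIGNED_BYTE, pData );

// Only then hand the renderbuffer to Core Animation.
glBindRenderbuffer( GL_RENDERBUFFER, mGlRenderBuffer );
[mGlContext presentRenderbuffer: GL_RENDERBUFFER];

Captured this way, the image is grabbed before the buffer is presented, so kEAGLDrawablePropertyRetainedBacking can stay off and the performance cost the comment mentions is avoided.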