
The following is code I use for reading an image from an OpenGL ES scene:

-(UIImage *)getImage{
    GLint width;
    GLint height;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);
    NSLog(@"%d %d", width, height);

    NSInteger myDataLength = width * height * 4;

    // allocate array and read pixels into it.
    GLubyte *buffer = (GLubyte *) malloc(myDataLength);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // gl renders "upside down" so swap top to bottom into new array.
    // there's gotta be a better way, but this works.
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
    for(int y = 0; y < height; y++)
        {
        for(int x = 0; x < width * 4; x++)
            {
            buffer2[((height - 1) - y) * width * 4 + x] = buffer[y * 4 * width + x];
            }
        }

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, NULL);

    // prep the ingredients
    int bitsPerComponent = 8;
    int bitsPerPixel = 32;
    int bytesPerRow = 4 * width;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

    // then make the uiimage from that
    UIImage *myImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpaceRef);
    free(buffer);
    free(buffer2);
    return myImage;

}
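As the comment in the code notes, there is a simpler way to do the vertical flip: copy whole rows with `memcpy` instead of individual bytes. A minimal C sketch, assuming tightly packed RGBA data (`flip_rows` is a hypothetical helper name, not part of the original post):

```c
#include <string.h>

// OpenGL returns rows bottom-up; CGImage expects them top-down.
// Copy one whole row at a time instead of byte-by-byte.
// Assumes tightly packed RGBA (4 bytes per pixel).
static void flip_rows(const unsigned char *src, unsigned char *dst,
                      int width, int height) {
    size_t rowBytes = (size_t)width * 4;
    for (int y = 0; y < height; y++) {
        memcpy(dst + (size_t)(height - 1 - y) * rowBytes,
               src + (size_t)y * rowBytes,
               rowBytes);
    }
}
```

In the method above this would replace the nested loop with a single call: `flip_rows(buffer, buffer2, width, height);`.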

This works on iOS 5.x and earlier, but on iOS 6.0 it now returns a black image. Why is glReadPixels() failing on iOS 6.0?

edited by Brad Larson
asked by GameLoading
  • hm, mine looks the same and works, except `GLint viewport[4]; glGetIntegerv(GL_VIEWPORT, viewport); int width = viewport[2]; int height = viewport[3];` – Volodymyr B. Sep 21 '12 at 10:39
  • yes, https://gist.github.com/3761227 – Volodymyr B. Sep 21 '12 at 12:35
  • @SAKrisT This is not working in iOS 6.0 – Crazy Developer Sep 21 '12 at 12:46
  • it works in my project; check for OpenGL errors with glGetError(); – Volodymyr B. Sep 21 '12 at 14:20
  • Please don't roll back my edits. They made the question more descriptive and much more easily searchable. Also, this isn't related to GPUImage, so I removed that tag. – Brad Larson Oct 09 '12 at 17:03
  • @BradLarson I am getting the same problem in GPUImage, so I added the tag; the fix for this code applies to GPUImage as well. – GameLoading Oct 12 '12 at 07:29
  • @fasttrack - I don't see how, because I don't ever use `glReadPixels()` for reading from a framebuffer after it has been presented to the screen. If you're trying to read from a GPUImageView using this, you're going about things the wrong way. Instead, capture the image from the filter just before the GPUImageView using `-imageFromCurrentlyProcessedOutput`. The view is not meant to be read from. – Brad Larson Oct 12 '12 at 13:45
  • @BradLarson, I am having the same issue; it doesn't work on the device but works on the simulator. I also noticed that it starts working on the device if I send the app to the background once and bring it back to the foreground. Does that give you any ideas about how this issue can be resolved? – Topsakal Aug 01 '13 at 20:02

2 Answers

CAEAGLLayer *eaglLayer = (CAEAGLLayer *) self.layer;
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking: [NSNumber numberWithBool:YES],
    kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
};

Set

kEAGLDrawablePropertyRetainedBacking = YES

(I do not know why this works, but it does.)
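For those asking where this goes: the drawable properties must be set on the CAEAGLLayer before the renderbuffer storage is allocated, since they only take effect when the storage is created. A sketch of the surrounding setup, where the `context` variable is assumed to be your EAGLContext (not part of the original answer):

```objectivec
// In the EAGL view's setup code, before allocating the renderbuffer:
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.drawableProperties = @{
    kEAGLDrawablePropertyRetainedBacking: @YES,
    kEAGLDrawablePropertyColorFormat: kEAGLColorFormatRGBA8
};
// The properties take effect when the storage is allocated from the layer:
[context renderbufferStorage:GL_RENDERBUFFER fromDrawable:eaglLayer];
```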

edited by danielbeard
answered by ka2n
  • The reason is probably that you're trying to read from the screen after the content has been presented using `[context presentRenderbuffer:GL_RENDERBUFFER];`. Unless you set the layer to use retained backing, as you do above, you're not guaranteed to have this content stick around after it has been presented. It appears that iOS 6.0 is more aggressive in removing this content after it is no longer needed. The above will result in a performance hit, so you might be better off performing a `glFinish()` operation and capturing the screen after that, then presenting the render buffer. – Brad Larson Sep 28 '12 at 15:00
  • I had eaglLayer.opaque = TRUE; in my code, which also prevented the screenshot code from working. I deleted eaglLayer.opaque = TRUE; and set kEAGLDrawablePropertyRetainedBacking = YES. Now it works. – Abbas Mousavi Oct 15 '12 at 14:22
  • How do you set kEAGLDrawablePropertyRetainedBacking = YES? And where? – Mann Sep 06 '13 at 14:22
  • Hi, I have set kEAGLDrawablePropertyRetainedBacking = YES but it is still not working. – Mann Sep 09 '13 at 08:21

Try this method to get a screenshot image. The output image is Mailimage:

- (UIImage*)screenshot
{
    // Create a graphics context with the target size
    // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
    // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
    CGSize imageSize = [[UIScreen mainScreen] bounds].size;
    if (NULL != UIGraphicsBeginImageContextWithOptions)
        UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
    else
        UIGraphicsBeginImageContext(imageSize);

    CGContextRef context = UIGraphicsGetCurrentContext();

    // Iterate over every window from back to front
    for (UIWindow *window in [[UIApplication sharedApplication] windows])
    {
        if (![window respondsToSelector:@selector(screen)] || [window screen] == [UIScreen mainScreen])
        {
            // -renderInContext: renders in the coordinate space of the layer,
            // so we must first apply the layer's geometry to the graphics context
            CGContextSaveGState(context);
            // Center the context around the window's anchor point
            CGContextTranslateCTM(context, [window center].x, [window center].y);
            // Apply the window's transform about the anchor point
            CGContextConcatCTM(context, [window transform]);
            // Offset by the portion of the bounds left of and above the anchor point
            CGContextTranslateCTM(context,
                                  -[window bounds].size.width * [[window layer] anchorPoint].x,
                                  -[window bounds].size.height * [[window layer] anchorPoint].y);

            // Render the layer hierarchy to the current context
            [[window layer] renderInContext:context];

            // Restore the context
            CGContextRestoreGState(context);
        }
    }

    // Retrieve the screenshot image
    // Retrieve the screenshot image
    UIImage *Mailimage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    return Mailimage;
}
answered by Deepjyoti Roy