
I used Apple's implementation to render an OpenGL ES view to a UIImage: http://developer.apple.com/library/ios/#qa/qa1704/_index.html

This works in the simulator, but when I try it on a device it just produces a blank image.

The code I am using is this:

// Rendering an OpenGL ES view into a UIImage
- (UIImage *)snapshotOfOpenGLView:(UIView *)eaglview
{
    GLint backingWidth, backingHeight;

    // Bind the color renderbuffer used to render the OpenGL ES view.
    // If your application only creates a single color renderbuffer which is already bound at this point,
    // this call is redundant, but it is needed if you're dealing with multiple renderbuffers.
    // Note: replace "_colorRenderbuffer" with the actual name of the renderbuffer object defined in your class.

    //    glBindRenderbufferOES(GL_RENDERBUFFER_OES, _colorRenderbuffer);

    // Get the size of the backing CAEAGLLayer
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

    // Create a CGImage with the pixel data.
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel;
    // otherwise, use kCGImageAlphaPremultipliedLast.
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace,
                                    kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                    ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS.
    // Create a graphics context with the target size measured in POINTS.
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration.
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0.
        CGFloat scale = eaglview.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    } else {
        // On iOS prior to 4, fall back to UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // The UIKit coordinate system is upside down relative to the GL/Quartz coordinate system.
    // Flip the CGImage by rendering it into the flipped bitmap context.
    // The size of the destination area is measured in POINTS.
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}
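As Brad Larson notes in the comments below, on iOS 6.0 `glReadPixels()` returns nothing once the scene has been presented to the screen, so one fix is to take the snapshot before `-presentRenderbuffer:` is called. A minimal sketch of that ordering, assuming a hypothetical `drawFrame` method, a `renderScene` helper, and an `EAGLContext` ivar named `context` (none of these names come from the question):

// Sketch only: drawFrame, renderScene, and context are assumed names,
// not part of the original code.
- (void)drawFrame
{
    [self renderScene];  // issue all OpenGL ES draw calls for this frame

    // Capture while the renderbuffer contents are still valid,
    // i.e. BEFORE -presentRenderbuffer: hands them to the screen.
    UIImage *snapshot = [self snapshotOfOpenGLView:self];

    [context presentRenderbuffer:GL_RENDERBUFFER_OES];

    // ... use or store `snapshot` here ...
}

This explains why the simulator behaves differently: the simulator's renderbuffer happens to keep its contents after presentation, while the device's does not.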

Does anyone know why this could be happening?

I tried it with Xcode 4.5 on the iOS 5.1 and 6.0 simulators, and it works.

It did not work on devices: an iPad 2 with iOS 6.0.1 and an iPhone 4S with iOS 6.0.1.

Thanks.

Nicol Bolas
fede
  • The above-linked question has an answer that provides a workaround, but the reason for this is that `glReadPixels()` in iOS 6.0 returns nothing after a scene has been presented to the screen. You either have to use retained backing (and sacrifice a little rendering performance) or capture the rendered scene before `-presentRenderbuffer:` is called. – Brad Larson Feb 14 '13 at 18:08
  • I am not sure this is the issue, as it is working on the iOS 6.0 simulator. However, I will try it and see if it works. – fede Feb 19 '13 at 12:00
  • Hi, I am having the same problem; how can I use retained backing? – Mann Sep 06 '13 at 15:44
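Regarding Mann's question: retained backing is requested when the CAEAGLLayer is configured, before the framebuffer is created. A minimal sketch, assuming a UIView subclass whose +layerClass is CAEAGLLayer; the dictionary keys are real EAGL constants, but the surrounding setup is illustrative:

// Illustrative: put this wherever your EAGL layer is configured
// (e.g. in the view's initializer, before creating the framebuffer).
CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
eaglLayer.opaque = YES;
eaglLayer.drawableProperties = @{
    // @YES = the renderbuffer keeps its contents after -presentRenderbuffer:,
    // so glReadPixels still works afterwards, at some rendering-performance cost.
    kEAGLDrawablePropertyRetainedBacking : @YES,
    kEAGLDrawablePropertyColorFormat     : kEAGLColorFormatRGBA8
};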

0 Answers