
I have an app similar to the GLPaint example (but built on OpenGL ES 2.0), and I want to take screenshots of the drawing view at certain moments. I've already read this topic, but I don't understand at which point I should call CVOpenGLESTextureCacheCreate and do the rest of the setup. Can anyone help me?

Siarhei Fedartsou

3 Answers


The code in the answer you link to covers the creation of a pixel buffer, the extraction of its matching texture, and the binding of that texture as the output of a framebuffer. You use that code once, when setting up the framebuffer you will render your scene into.

Whenever you want to capture from that texture, you'll probably want to use glFinish() to block until all OpenGL ES rendering has completed, then use the code I describe there:

CVPixelBufferLockBaseAddress(renderTarget, 0);
_rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
// Do something with the bytes
CVPixelBufferUnlockBaseAddress(renderTarget, 0);

to extract the raw bytes for the texture containing the image of your scene.

The internal byte ordering of iOS textures is BGRA, so you'll want to use something like the following to create a CGImageRef from those bytes:

// It appears that the width of a texture must be padded out to be a multiple of 8 (32 bytes) if reading from it using a texture cache
NSUInteger paddedWidthOfImage = CVPixelBufferGetBytesPerRow(renderTarget) / 4;
NSUInteger paddedBytesForImage = paddedWidthOfImage * (int)currentFBOSize.height * 4;
dataProvider = CGDataProviderCreateWithData((__bridge_retained void*)self, _rawBytesForImage, paddedBytesForImage, dataProviderUnlockCallback);

cgImageFromBytes = CGImageCreate((int)currentFBOSize.width, (int)currentFBOSize.height,
                                 8, 32,
                                 CVPixelBufferGetBytesPerRow(renderTarget),
                                 defaultRGBColorSpace,
                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                 dataProvider, NULL, NO, kCGRenderingIntentDefault);

In the above, I use a dataProviderUnlockCallback() function to handle the unlocking of the pixel buffer and safe resumption of rendering, but you can probably ignore that in your case and just pass in NULL for the parameter there.
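If you do implement the callback, a minimal sketch could look like the following. This is an assumption for illustration, not the exact implementation from the answer: here the pixel buffer itself, rather than `self`, is passed as the `info` pointer to keep the sketch self-contained.

```objc
// Sketch of a CGDataProviderReleaseDataCallback: Core Graphics calls this
// when the CGImage no longer needs the bytes, which is the safe point to
// unlock the pixel buffer and resume rendering into the texture.
// Assumes the pixel buffer was passed as the `info` parameter of
// CGDataProviderCreateWithData().
void dataProviderUnlockCallback(void *info, const void *data, size_t size)
{
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)info;
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```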

Brad Larson
  • When should I bind the texture as the output of a framebuffer? In the framebuffer creation code? – Siarhei Fedartsou Sep 12 '12 at 15:20
  • @miksayer - Yes, you need to do this as part of your framebuffer setup, as I indicate above. – Brad Larson Sep 12 '12 at 15:21
  • http://pastie.org/4712848 – this is my framebuffer setup code. I get EXC_BAD_ACCESS at the glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture)); line. What am I doing wrong? – Siarhei Fedartsou Sep 13 '12 at 09:51
  • I've fixed the EXC_BAD_ACCESS now: http://pastie.org/4719978 . But if I try to take a screenshot with this code http://pastie.org/4719980 I get only an empty array of bytes. Where am I going wrong? Help me please. – Siarhei Fedartsou Sep 14 '12 at 13:54

Another option is:

  1. Create a new CGBitmapContext
  2. Call [layer renderInContext:yourNewContext]
  3. Get a CGImage from the bitmap context
  4. Get a pixel buffer, render the CGImage into it, and append it to an AVAssetWriter via an AVAssetWriterInputPixelBufferAdaptor.

Best of luck!

EDIT

This may not work: I'm testing it now and will re-post asap.

Sam Ballantyne
  • Unfortunately, that won't work for a CAEAGLLayer, as is used here. `-renderInContext:` doesn't capture OpenGL ES content. – Brad Larson Aug 18 '13 at 16:55
  • Really? Yikes. That's great to know. Out of curiosity: will your render-to-texture method work with a GL context with retained backing? – Sam Ballantyne Aug 18 '13 at 17:16
  • I believe so, but I've never tried. As long as the rendering completes through to the texture, you'll be able to pull bytes from its matching CVPixelBuffer. – Brad Larson Aug 19 '13 at 14:28
  • What would the target chain be for a painting app that records the screen, in your framework? GPUImageView -> GPUImageMovieWriter? Or vice versa? – Sam Ballantyne Aug 30 '13 at 15:35

Try VTCreateCGImageFromCVPixelBuffer from the VideoToolbox framework (available on iOS 9.0 and later):

OSStatus VTCreateCGImageFromCVPixelBuffer(CVPixelBufferRef pixelBuffer, CFDictionaryRef options, CGImageRef  _Nullable *imageOut);
Pass the pixel buffer that will serve as the image data source for your CGImage, set options to NULL, and receive the created image through imageOut.
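A minimal usage sketch (assuming `pixelBuffer` is the CVPixelBufferRef holding your rendered frame; error handling elided):

```objc
#import <VideoToolbox/VideoToolbox.h>

CGImageRef cgImage = NULL;
OSStatus status = VTCreateCGImageFromCVPixelBuffer(pixelBuffer, NULL, &cgImage);
if (status == noErr && cgImage != NULL) {
    UIImage *screenshot = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage); // UIImage retains its own reference
    // use screenshot...
}
```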

James Bush