
I want to screen capture iOS frames into an AVAssetWriter (or even just a UIImage). The traditional method of using glReadPixels works just fine, but it is very slow. I understand that since iOS 5.0 there is a different, faster method. I followed lots of posts, like OpenGL ES 2d rendering into image, which mention the use of CVOpenGLESTextureCacheCreate, but I can't get it to work.
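
For context, the slow path I'm using now looks roughly like this (a sketch along the lines of Apple's QA1704 sample linked below; the buffer names are mine):

    // Read the currently bound framebuffer back into CPU memory.
    // This is the slow part: glReadPixels stalls until the GPU finishes.
    GLint width, height;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);

    NSInteger dataLength = width * height * 4;
    GLubyte *rawData = (GLubyte *)malloc(dataLength * sizeof(GLubyte));

    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, rawData);
    // ...wrap rawData in a CGImage/UIImage or feed it to the AVAssetWriter...
    free(rawData);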

Right now, right before every call to presentRenderbuffer:, I'm following Apple's glReadPixels sample (http://developer.apple.com/library/ios/#qa/qa1704/_index.html), and that works. When I instead try to follow the post OpenGL ES 2d rendering into image to get the image, which basically replaces the call to glReadPixels with creating a texture cache, binding its texture to a render target (a pixel buffer), and then reading from the pixel buffer, it seems to "steal" the images from the screen, so nothing is rendered.
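
For reference, the setup I'm attempting looks roughly like this (a sketch only; _textureCache, renderTarget, renderTexture, and _offscreenFramebuffer are my own names, context is the EAGLContext, and width/height are the backing size of the renderbuffer):

    #import <CoreVideo/CoreVideo.h>
    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>

    // A texture cache tied to the GL context.
    CVOpenGLESTextureCacheRef _textureCache;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL,
                                 &_textureCache);

    // An IOSurface-backed pixel buffer to use as the render target.
    CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

    CVPixelBufferRef renderTarget;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, attrs, &renderTarget);

    // Wrap the pixel buffer in a GL texture...
    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache,
                                                 renderTarget, NULL,
                                                 GL_TEXTURE_2D, GL_RGBA,
                                                 width, height,
                                                 GL_BGRA, GL_UNSIGNED_BYTE, 0,
                                                 &renderTexture);

    // ...and attach it as the color target of an OFFSCREEN framebuffer,
    // separate from the one whose renderbuffer goes to presentRenderbuffer:.
    glBindFramebuffer(GL_FRAMEBUFFER, _offscreenFramebuffer);
    glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                           CVOpenGLESTextureGetName(renderTexture), 0);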

Can anyone shed some light on how to do this? Also, please mention whether this works only for OpenGL ES 2.0; I'm looking for a fast alternative that will also work on earlier versions.

A code sample would be excellent.

  • You aren't trying to present your texture-backed FBO to the screen, are you? You're probably going to want to render to your texture, grab the bytes from the associated pixel buffer, then re-render the texture to the screen via a quad and a passthrough shader. – Brad Larson Aug 03 '12 at 15:03
  • Thanks, can you add some code samples for the entire process? I'm kind of new to OpenGL (as mentioned, my starting point is presentRenderbuffer:, and my current code reads the frame buffer using glReadPixels). Thanks! – user1574100 Aug 04 '12 at 17:53
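
Following up on the comment above, a minimal per-frame sketch might look like this (untested; it assumes the renderTarget/renderTexture setup from the question, and that the frame was just drawn into the offscreen FBO rather than the on-screen one):

    // Wait for the GPU to finish the frame, then read straight out of the
    // pixel buffer that backs the offscreen render target.
    glFinish();

    CVPixelBufferLockBaseAddress(renderTarget, kCVPixelBufferLock_ReadOnly);
    uint8_t *pixels = (uint8_t *)CVPixelBufferGetBaseAddress(renderTarget);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(renderTarget);
    // ...append renderTarget via an AVAssetWriterInputPixelBufferAdaptor,
    // or build a UIImage from pixels/bytesPerRow...
    CVPixelBufferUnlockBaseAddress(renderTarget, kCVPixelBufferLock_ReadOnly);

    // To keep the on-screen image, bind the normal framebuffer, draw a
    // full-screen quad textured with renderTexture through a passthrough
    // shader, and only then call presentRenderbuffer:.

The "stolen" frames described in the question would then be explained by rendering into the offscreen target without that final re-draw to the on-screen framebuffer.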

0 Answers