I want to screen capture iOS frames into an AVAssetWriter (or even just a UIImage). The traditional method of using glReadPixels works just fine - but it is very slow. I understand that since iOS 5.0 there is a different, faster method I can use.
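To be concrete, this is the kind of read-back I mean by the traditional method (a rough outline in Objective-C, not my exact code; the surrounding buffer handling is omitted, and under ES 1.1 the renderbuffer calls are the ...OES variants):

    // Query the size of the currently bound (on-screen) renderbuffer.
    GLint width = 0, height = 0;
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &width);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &height);

    // Copy every pixel back to the CPU - this is the slow part.
    NSInteger dataLength = width * height * 4;
    GLubyte *buffer = (GLubyte *)malloc(dataLength);
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    // ...wrap `buffer` in a CGImage/UIImage or hand it to AVAssetWriter, then free(buffer)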
I followed lots of posts, like OpenGL ES 2d rendering into image, which mention the use of CVOpenGLESTextureCacheCreate - but I can't get it to work.
Right now, right before every call to presentRenderbuffer:, I'm following Apple's glReadPixels sample (http://developer.apple.com/library/ios/#qa/qa1704/_index.html), and that works. When I instead try to follow the post above (OpenGL ES 2d rendering into image) - which basically replaces the glReadPixels call by creating a texture cache, binding its texture to a render target (a pixel buffer), and then reading from the pixel buffer - it seems to "steal" the images from the screen, so nothing is rendered.
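For reference, here is a sketch of what that texture-cache attempt looks like (only an outline of the steps, not my exact code; the method and ivar names - _eaglContext, _pixelBuffer, _textureCache, _renderTexture, _offscreenFBO - are placeholders of mine):

    #import <OpenGLES/EAGL.h>
    #import <OpenGLES/ES2/gl.h>
    #import <OpenGLES/ES2/glext.h>
    #import <CoreVideo/CoreVideo.h>

    // Ivars assumed: CVPixelBufferRef _pixelBuffer; CVOpenGLESTextureCacheRef _textureCache;
    //                CVOpenGLESTextureRef _renderTexture; GLuint _offscreenFBO; EAGLContext *_eaglContext;

    // One-time setup: a pixel-buffer-backed texture attached to an offscreen FBO.
    - (BOOL)setupOffscreenCaptureWithWidth:(size_t)width height:(size_t)height
    {
        // The texture cache only works with an IOSurface-backed pixel buffer.
        NSDictionary *attrs = @{ (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{} };
        CVReturn err = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                           kCVPixelFormatType_32BGRA,
                                           (__bridge CFDictionaryRef)attrs, &_pixelBuffer);
        if (err != kCVReturnSuccess) return NO;

        // (On older SDKs this context parameter is declared void * and needs a __bridge cast.)
        err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                           _eaglContext, NULL, &_textureCache);
        if (err != kCVReturnSuccess) return NO;

        // Wrap the pixel buffer in a GL texture.
        err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache,
                                                           _pixelBuffer, NULL,
                                                           GL_TEXTURE_2D, GL_RGBA,
                                                           (GLsizei)width, (GLsizei)height,
                                                           GL_BGRA, GL_UNSIGNED_BYTE, 0,
                                                           &_renderTexture);
        if (err != kCVReturnSuccess) return NO;

        glBindTexture(CVOpenGLESTextureGetTarget(_renderTexture),
                      CVOpenGLESTextureGetName(_renderTexture));
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        // Attach the texture to an offscreen framebuffer and render into that.
        glGenFramebuffers(1, &_offscreenFBO);
        glBindFramebuffer(GL_FRAMEBUFFER, _offscreenFBO);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                               CVOpenGLESTextureGetName(_renderTexture), 0);
        return glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE;
    }

    // Per frame: draw into the offscreen FBO, then read the pixels straight from the buffer.
    // This is where the frame ends up in _pixelBuffer instead of on screen.
    - (void)captureFrame
    {
        glBindFramebuffer(GL_FRAMEBUFFER, _offscreenFBO);
        // ... draw the scene here ...
        glFinish(); // make sure rendering has completed before touching the buffer

        CVPixelBufferLockBaseAddress(_pixelBuffer, 0);
        void *pixels = CVPixelBufferGetBaseAddress(_pixelBuffer); // BGRA, no glReadPixels needed
        // ...append to an AVAssetWriter pixel buffer adaptor or build a UIImage from `pixels`...
        CVPixelBufferUnlockBaseAddress(_pixelBuffer, 0);
    }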
Can anyone shed some light on how to do this? Also, please mention whether this works only for OpenGL ES 2.0 - I am looking for a fast alternative that will also work on previous versions.
A code sample would be excellent.