Here's what I am trying to do in brief:
- capture camera output using the AVCaptureVideoDataOutputSampleBufferDelegate
- then process the frames from inside the captureOutput delegate method through OpenGL (roughly the setup sketched below).
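For reference, this is roughly how I have the capture side set up (the class name and queue label below are placeholders, not my exact code):

```swift
import AVFoundation
import CoreVideo

final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let sampleQueue = DispatchQueue(label: "camera.sample.queue") // placeholder label

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }

        // BGRA keeps the OpenGL upload to a single-plane texture.
        videoOutput.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        videoOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // A fresh CMSampleBuffer arrives here for every frame; this is where
        // the OpenGL processing is supposed to happen.
    }
}
```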
Now, according to Apple's examples, the way to get this working is with these steps (roughly sketched in the code below):
- create a CVOpenGLESTextureCache
- use CVOpenGLESTextureCacheCreateTextureFromImage to create a texture from the texture cache and the provided sample buffer
- act on this texture by passing it to OpenGL
- release the texture and flush the texture cache every frame
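To make those steps concrete, here is a minimal sketch of that per-frame flow (assuming a BGRA, single-plane pixel buffer; the FrameRenderer name is just a placeholder), called from captureOutput for every incoming sample buffer:

```swift
import CoreMedia
import CoreVideo
import OpenGLES

final class FrameRenderer {
    private var textureCache: CVOpenGLESTextureCache?

    init(context: EAGLContext) {
        // The texture cache is created once, tied to the GL context.
        CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &textureCache)
    }

    func render(_ sampleBuffer: CMSampleBuffer) {
        guard let cache = textureCache,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let width = GLsizei(CVPixelBufferGetWidth(pixelBuffer))
        let height = GLsizei(CVPixelBufferGetHeight(pixelBuffer))

        // A new texture is created from the cache for every sample buffer.
        // 0x80E1 is GL_BGRA from the BGRA8888 texture-format extension.
        var texture: CVOpenGLESTexture?
        CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer,
                                                     nil, GLenum(GL_TEXTURE_2D), GL_RGBA,
                                                     width, height, GLenum(0x80E1),
                                                     GLenum(GL_UNSIGNED_BYTE), 0, &texture)
        guard let tex = texture else { return }

        // The texture is bound and handed to the rest of the OpenGL pipeline.
        glBindTexture(CVOpenGLESTextureGetTarget(tex), CVOpenGLESTextureGetName(tex))
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
        glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_CLAMP_TO_EDGE)
        // ... draw with the texture here ...

        // The CVOpenGLESTexture is released (it is dropped when `tex` goes out
        // of scope) and the cache is flushed, once per frame.
        glBindTexture(GLenum(GL_TEXTURE_2D), 0)
        CVOpenGLESTextureCacheFlush(cache, 0)
    }
}
```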
Now, at the outset this doesn't seem like the most efficient way of doing it, but it still appears to be the fastest approach available for this sort of thing.
However, I have come across a couple of questions here which suggest that the texture does not need to be released and recreated every time a new sample buffer arrives:
OpenGL ES to video in iOS (rendering to a texture with iOS 5 texture cache)
Faster alternative to glReadPixels in iPhone OpenGL ES 2.0
I have tried this approach, but the texture doesn't seem to get updated, which makes sense to me: I don't see how the texture would update automatically when there is a new sample buffer every frame.
Am I understanding this correctly, and is Apple's way the way to go here, or is it really possible to avoid recreating the texture and still have it update every time a new sample buffer arrives?
Thanks!