
Is it possible to use video (pre-rendered, compressed with H.264) as a texture for OpenGL on iOS?

If so, how is it done? And are there any playback quality or frame-rate limitations?

eonil

2 Answers


As of iOS 4.0, you can use AVCaptureDeviceInput to get the camera as a device input and connect it to an AVCaptureVideoDataOutput with any object you like set as the delegate. If you request a 32bpp BGRA format from the camera, the delegate object will receive each frame in a format that can be handed straight to glTexImage2D (or glTexSubImage2D if the device doesn't support non-power-of-two textures; I think the MBX devices fall into this category).
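By way of illustration only (this is not from the original answer: the `_session` and `_texture` ivars, the main-queue delegate, and the assumption that an EAGLContext is current and a GL texture already exists are all mine; GL_BGRA_EXT comes from Apple's BGRA texture extension declared in ES2/glext.h), the pipeline might look roughly like this:

```objc
#import <AVFoundation/AVFoundation.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>   // for GL_BGRA_EXT

// Assumes an EAGLContext is current on the queue that receives frames and
// that _texture is an existing GL texture name with filtering already set.
- (void)startCapture
{
    _session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [_session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings =
        @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey :
               @(kCVPixelFormatType_32BGRA) };   // ask for 32bpp BGRA frames
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [_session addOutput:output];

    [_session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    // Hand the frame straight to GL; use glTexSubImage2D instead if the
    // device can't do non-power-of-two textures.
    glBindTexture(GL_TEXTURE_2D, _texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    // ...redraw the textured quad here...
}
```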

There are a bunch of frame size and frame rate options; at a guess you'll have to tweak those depending on how much else you want to use the GPU for. I found that a completely trivial scene, just a textured quad showing the latest frame, redrawn only when a new frame arrived, could display an iPhone 4's maximum 720p 24fps feed without any noticeable lag. I haven't performed any more thorough benchmarking than that, so hopefully someone else can advise.

In principle, per the API, frames can come back with some in-memory padding between scanlines, which would mean some shuffling of the contents before handing them off to GL, so you do need to implement a code path for that. In practice, speaking purely empirically, the current version of iOS never seems to return images in that form, so it isn't really a performance issue.
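If you do want that defensive path, one way to handle it (a sketch that assumes the pixel buffer's base address is already locked, as in the sketch above) is to compare the reported bytes-per-row with the tightly packed value and repack only when they differ:

```objc
// Inside the sample buffer delegate, with the base address already locked.
size_t width       = CVPixelBufferGetWidth(pixelBuffer);
size_t height      = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
uint8_t *base      = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);

if (bytesPerRow == width * 4) {
    // No padding between scanlines: upload the buffer as-is.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, base);
} else {
    // Padded rows: copy each scanline into a tightly packed buffer first.
    uint8_t *tight = malloc(width * height * 4);
    for (size_t y = 0; y < height; y++) {
        memcpy(tight + y * width * 4, base + y * bytesPerRow, width * 4);
    }
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, tight);
    free(tight);
}
```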

EDIT: it's now very close to three years later. In the interim Apple has released iOS 5, 6 and 7. With iOS 5 they introduced CVOpenGLESTexture and CVOpenGLESTextureCache, which are now the smart way to pipe video from a capture device into OpenGL. Apple supplies sample code here, from which the particularly interesting parts are in RippleViewController.m, specifically its setupAVCapture and captureOutput:didOutputSampleBuffer:fromConnection: — see lines 196–329. Sadly the terms and conditions prevent a duplication of the code here without attaching the whole project, but the step-by-step setup is:

  1. create a CVOpenGLESTextureCache (with CVOpenGLESTextureCacheCreate) and an AVCaptureSession;
  2. grab a suitable AVCaptureDevice for video;
  3. create an AVCaptureDeviceInput with that capture device;
  4. attach an AVCaptureVideoDataOutput and tell it to call you as a sample buffer delegate (a rough setup sketch follows this list).
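A rough sketch of that setup, not Apple's sample code: `self.context` is assumed to be an existing EAGLContext, `_textureCache` and `_session` are assumed ivars, `self` adopts AVCaptureVideoDataOutputSampleBufferDelegate, and error handling is omitted.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CVOpenGLESTextureCache.h>

- (void)setupCapture
{
    // 1. A texture cache tied to the GL context, plus a capture session.
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                 self.context, NULL, &_textureCache);

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset640x480;

    // 2. A suitable video capture device.
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // 3. A device input wrapping that device.
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // 4. A video data output delivering bi-planar Y'CbCr buffers to us.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings =
        @{ (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey :
               @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [session addOutput:output];

    [session startRunning];
    _session = session;   // keep the session alive
}
```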

Upon receiving each sample buffer:

  1. get the CVImageBufferRef from it;
  2. use CVOpenGLESTextureCacheCreateTextureFromImage to get Y and UV CVOpenGLESTextureRefs from the CV image buffer;
  3. get texture targets and names from the CV OpenGLES texture refs in order to bind them;
  4. combine luminance and chrominance in your shader (see the sketches after this list).
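
Again purely as an illustration (the `_textureCache` ivar from the sketch above, the choice of GL_LUMINANCE/GL_LUMINANCE_ALPHA as the plane formats, and the texture-unit assignments are my assumptions; error checking is omitted), the per-frame work might look roughly like:

```objc
#import <CoreVideo/CVOpenGLESTextureCache.h>
#import <OpenGLES/ES2/gl.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    // 1. The CVImageBufferRef for this frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    // 2. Y and CbCr textures straight from the texture cache.
    CVOpenGLESTextureRef yTexture = NULL, uvTexture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, _textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE, (GLsizei)width, (GLsizei)height,
        GL_LUMINANCE, GL_UNSIGNED_BYTE, 0, &yTexture);          // plane 0: Y
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, _textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_LUMINANCE_ALPHA,
        (GLsizei)(width / 2), (GLsizei)(height / 2),
        GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, 1, &uvTexture);   // plane 1: CbCr

    // 3. Bind via the target/name pairs the texture refs hand back.
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(CVOpenGLESTextureGetTarget(yTexture),
                  CVOpenGLESTextureGetName(yTexture));
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(CVOpenGLESTextureGetTarget(uvTexture),
                  CVOpenGLESTextureGetName(uvTexture));
    // (Set filter/wrap parameters here as usual, then draw.)

    // 4. The shader recombines the planes; release the refs when done.
    CFRelease(yTexture);
    CFRelease(uvTexture);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}
```

For step 4, a fragment shader along these lines would recombine the planes. The uniform names are mine, and the matrix is the full-range BT.601 conversion, an assumption that should be matched to whichever pixel format you actually request:

```objc
// Fragment shader source (compiled with glShaderSource/glCompileShader as
// usual). With GL_LUMINANCE the Y sample arrives in .r; with
// GL_LUMINANCE_ALPHA, Cb arrives in .r and Cr in .a.
static const char *kFragmentShader =
    "varying highp vec2 vTexCoord;                             \n"
    "uniform sampler2D uLuma;    // texture unit 0             \n"
    "uniform sampler2D uChroma;  // texture unit 1             \n"
    "void main()                                               \n"
    "{                                                         \n"
    "    mediump vec3 yuv;                                     \n"
    "    yuv.x  = texture2D(uLuma, vTexCoord).r;               \n"
    "    yuv.yz = texture2D(uChroma, vTexCoord).ra - vec2(0.5);\n"
    "    mediump vec3 rgb = mat3(1.0,    1.0,    1.0,          \n"
    "                            0.0,   -0.344,  1.772,        \n"
    "                            1.402, -0.714,  0.0) * yuv;   \n"
    "    gl_FragColor = vec4(rgb, 1.0);                        \n"
    "}                                                         \n";
```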
Tommy
  • Thanks for the answer. I'll dig into it. – eonil Nov 22 '10 at 07:35
  • It's only about 200 lines to tie the whole thing together; sorry for not posting code. I've tackled the problem only while at work, so I'm contractually barred from posting what I have. But it's really trivial stuff, and I definitely spent a lot longer figuring out which were the appropriate classes in the documentation than I did coding. – Tommy Nov 22 '10 at 10:30
  • My blog post about using 32BPP images as textures in OpenGL includes an Xcode project and source code: http://www.modejong.com/blog/post7_load_opengl_textures_with_alpha_channel_on_ios/index.html – MoDJ Jul 06 '13 at 03:02
  • @Tommy all the answers on this topic on SO right now are "it's easy. I'm not going to show you how. It's easy. Just dig in the APIs. Yeah". I appreciate you wrote something quickly, but that was 3 years ago, and all SO has right now are terrible answers on this topic. – Adam Nov 04 '13 at 12:17
  • @Adam since iOS 5 (about a year after my original answer) Apple has provided built-in functionality to commute `CVImageBuffer`s into OpenGL and has supplied a sample project; I've added a link as well as a more detailed breakdown of the steps involved. – Tommy Nov 04 '13 at 20:26
  • @Tommy - that's awesome, thanks! None of the other SO answers I saw mentioned any of that. Looks like a huge improvement :) – Adam Nov 05 '13 at 14:54
  • @Tommy - Apple's sample project is awful - it wastes time making a complex graphical effect that makes it impossible to re-use the code in a normal app. I've spent hours trying to remove the useless bits, but something buried in there is necessary to make the main code work, and I can't figure out what :(. – Adam Dec 21 '13 at 14:47
  • @Tommy I followed your answer and dug into it. I found GPUImage and made a demo, but I have a problem: the frames aren't rendered at the same time. Could you tell me how I can fix it? http://stackoverflow.com/questions/36290137/gpuimage-render-video-as-texture-but-not-the-same-time – Allan Mar 30 '16 at 02:53

Use RosyWriter for a MUCH better example of how to do OpenGL video rendering. Performance is very good, especially if you reduce the frame rate (~10% at 1080p/30, >=5% at 1080p/15).
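One hedged sketch of the frame-rate reduction (iOS 7 and later; `camera` here stands for the AVCaptureDevice already feeding your session):

```objc
// Cap capture at 15fps by fixing the frame duration to 1/15s.
NSError *error = nil;
if ([camera lockForConfiguration:&error]) {
    camera.activeVideoMinFrameDuration = CMTimeMake(1, 15);
    camera.activeVideoMaxFrameDuration = CMTimeMake(1, 15);
    [camera unlockForConfiguration];
}
```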

sounder_michael