
I'm using glReadPixels to read data into a CVPixelBufferRef. I use the CVPixelBufferRef as the input into an AVAssetWriter. Unfortunately the pixel formats seem to be mismatched.

I think glReadPixels is returning pixel data in RGBA format while AVAssetWriter wants pixel data in ARGB format. What's the best way to convert RGBA to ARGB?

Here's what I've tried so far:

  • bit manipulation along the lines of argb = (rgba >> 8) | (rgba << 24)
  • using a CGImageRef as an intermediate step

The bit manipulation didn't work because CVPixelBufferRef doesn't seem to support subscripts. The CGImageRef intermediate step does work... but I'd prefer not to have 50 extra lines of code that could potentially be a performance hit.
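
For context, the setup described above (and in the comments below) boils down to roughly the following; the two halves disagree about byte order, which is where the mismatch comes from. This is my own sketch, with placeholder variable names and sizes:

// Roughly the pipeline from the question: an ARGB pixel buffer destined for
// AVAssetWriter, filled by glReadPixels, which only hands back RGBA-ordered bytes.
size_t width = 480, height = 320;                // placeholder render size
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32ARGB,   // writer side: bytes A,R,G,B
                    NULL, &pixelBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
glReadPixels(0, 0, (GLsizei)width, (GLsizei)height,
             GL_RGBA, GL_UNSIGNED_BYTE,          // GL side: bytes R,G,B,A
             CVPixelBufferGetBaseAddress(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
// ...pixelBuffer then goes to AVAssetWriter (e.g. via a pixel buffer adaptor),
// and the colors come out wrong because of the mismatched component order.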

Charles
MrDatabase
  • How are you getting your CVPixelBufferRef? Are you calling CVPixelBufferCreate? If so, what are you passing as the pixelFormatType parameter? What format are you passing to glReadPixels? – rob mayoff Oct 25 '11 at 21:27
  • I'm passing `kCVPixelFormatType_32ARGB` to `CVPixelBufferCreate`. I pass `GL_RGBA` to `glReadPixels`. I tried passing an RGBA format to `CVPixelBufferCreate` but it didn't work (either bad colors or the app crashed). – MrDatabase Oct 25 '11 at 22:25

5 Answers


Regarding the bit manipulation, you can get a pointer to the pixel buffer's raw data:

CVPixelBufferLockBaseAddress(buffer, 0);
size_t width  = CVPixelBufferGetWidth(buffer);
size_t height = CVPixelBufferGetHeight(buffer);
size_t stride = CVPixelBufferGetBytesPerRow(buffer);   // may include row padding
char *data = (char *)CVPixelBufferGetBaseAddress(buffer);
for (size_t y = 0; y < height; ++y) {
    uint32_t *pixels = (uint32_t *)(data + stride * y);
    for (size_t x = 0; x < width; ++x) {
        // Rotate each 32-bit pixel by 8 bits so the alpha byte moves to the front:
        // on little-endian iOS hardware, reordering R,G,B,A bytes into A,R,G,B
        // is a left rotate of the 32-bit value.
        pixels[x] = (pixels[x] << 8) | (pixels[x] >> 24);
    }
}
CVPixelBufferUnlockBaseAddress(buffer, 0);
rob mayoff

Better than using the CPU to swap the components would be to write a simple fragment shader to efficiently do it on the GPU as you render the image.
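
For illustration only (this is not code from the answer, and the uniform/varying names are invented), such a pass-through fragment shader could reorder the components as part of the final render pass:

// Sketch: ES 2.0 fragment shader source, stored as a C string, that emits the
// sampled texel with its components reordered on the GPU.
static const char *kSwizzleFragmentShaderSource =
    "precision mediump float;                                      \n"
    "varying vec2 v_texCoord;      // hypothetical varying name    \n"
    "uniform sampler2D u_texture;  // hypothetical uniform name    \n"
    "void main()                                                   \n"
    "{                                                             \n"
    "    // Pick .bgra or .argb depending on the order the writer expects. \n"
    "    gl_FragColor = texture2D(u_texture, v_texCoord).bgra;     \n"
    "}                                                             \n";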

And the best bestest way is to completely remove the copying stage by using an iOS5 CoreVideo CVOpenGLESTextureCache which allows you to render straight to the CVPixelBufferRef, eliminating the call to glReadPixels.

P.S. I'm pretty sure AVAssetWriter wants data in BGRA format (actually it probably wants it in YUV, but that's another story).
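
As a rough illustration (my sketch, not the answerer's code; `writerInput` is assumed to be an already-configured video AVAssetWriterInput), the BGRA expectation is typically declared when creating the pixel buffer adaptor:

// Sketch: tell the adaptor to hand out / accept BGRA pixel buffers.
NSDictionary *sourceAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
        (id)kCVPixelBufferPixelFormatTypeKey,
    nil];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                   sourcePixelBufferAttributes:sourceAttributes];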

UPDATE: as for links, the doco seems to still be under NDA, but there are two pieces of freely downloadable example code available:

GLCameraRipple and RosyWriter

The header files themselves contain good documentation, and the Mac equivalent (CVOpenGLTextureCache) is very similar, so you should have plenty to get you started.
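
The header-file documentation boils down to something like the following sketch (my own illustration, not code from the answer or the samples; names, formats, and sizes are placeholders, error handling and ARC bridging casts are omitted, and a framebuffer object is assumed to be generated and bound already):

#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#import <CoreVideo/CVOpenGLESTextureCache.h>

// Sketch of the render-to-pixel-buffer path: the pixel buffer is IOSurface-backed,
// the texture cache wraps it in a GL texture, and that texture becomes the FBO's
// color attachment, so the rendered frame lands in the buffer with no glReadPixels.
static CVPixelBufferRef RenderIntoPixelBuffer(EAGLContext *context,
                                              size_t width, size_t height)
{
    NSDictionary *attrs = [NSDictionary dictionaryWithObject:[NSDictionary dictionary]
                                                      forKey:(id)kCVPixelBufferIOSurfacePropertiesKey];
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA,
                        (CFDictionaryRef)attrs, &pixelBuffer);

    CVOpenGLESTextureCacheRef cache = NULL;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &cache);

    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer,
                                                 NULL, GL_TEXTURE_2D, GL_RGBA,
                                                 (GLsizei)width, (GLsizei)height,
                                                 GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                           CVOpenGLESTextureGetName(texture), 0);

    // ... issue draw calls here; the output is written straight into pixelBuffer ...
    glFinish();  // make sure the GPU is done before the buffer is handed off

    CFRelease(texture);
    CFRelease(cache);   // real code would keep the cache around and reuse it per frame
    return pixelBuffer; // hand this to AVAssetWriter (e.g. via a pixel buffer adaptor)
}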

Rhythmic Fistman
  • `OpenGLTextureCache` sounds awesome. Have any links to relevant docs? – MrDatabase Oct 26 '11 at 18:11
  • if you have an ADC login you can get the docs, added some example code to answer – Rhythmic Fistman Oct 26 '11 at 21:26
  • Looks really interesting! Thanks :-) – MrDatabase Oct 27 '11 at 21:15
  • They're The Way; you'll never look at glReadPixels & glTexImage2D the same way again. One caveat, kinda big: I'm pretty sure this stuff has hardware support on the iPad 2 (maybe on the 4S too); on lesser hardware the timings suggest that it falls back on glReadPixels/glTexImage2D. – Rhythmic Fistman Oct 27 '11 at 21:27
  • "And the best bestest way is to completely remove the copying stage by using an iOS5 CoreVideo CVOpenGLESTextureCache which allows you to render straight to the CVPixelBufferRef, eliminating the call to glReadPixels." - can't find any documentation to this much wanted feature – Or Arbel Jan 24 '12 at 15:35
  • Read the header file: CVOpenGLESTextureCache.h – Rhythmic Fistman Jan 24 '12 at 16:00
  • I did, it says: "Creates a CVOpenGLESTexture object from an existing CVImageBuffer". This is the opposite of what we want :( – kevlar Feb 20 '12 at 09:01
  • http://stackoverflow.com/questions/9550297/faster-alternative-to-glreadpixels-in-iphone-opengl-es-2-0 – jonahb May 19 '14 at 02:19

It's kind of a shot in the dark, but have you tried GL_BGRA for glReadPixels with kCVPixelFormatType_32BGRA for CVPixelBufferCreate?

I suggest this because Technical Q&A QA1501 doesn't list any RGBA format as supported.
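
For what it's worth, a minimal sketch of that combination (my illustration; variable names and the fixed size are placeholders, error checking omitted):

// Sketch: a BGRA pixel buffer that glReadPixels can fill directly with GL_BGRA bytes.
size_t width = 480, height = 320;            // placeholder render size
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
// Assumes bytes-per-row == width * 4; if CVPixelBufferGetBytesPerRow() reports
// padding, read into a temporary buffer and copy row by row instead.
glReadPixels(0, 0, (GLsizei)width, (GLsizei)height, GL_BGRA, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddress(pixelBuffer));
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);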

rob mayoff
    // Read the framebuffer as BGRA, then wrap the same bytes in a BGRA pixel buffer.
    glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, _buffer);
    CVPixelBufferRef buffer = NULL;
    CVReturn ret = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height,
                                                kCVPixelFormatType_32BGRA, _buffer,
                                                width * 4, NULL, NULL, NULL, &buffer);
gngrwzrd

glReadPixels(0, 0, w*s, h*s, GL_BGRA, GL_UNSIGNED_BYTE, buffer);

Use GL_BGRA in glReadPixels. It works, just tried it myself.

JP Hribovsek