
I'm recording live video in my iOS app. On another Stack Overflow page, I found that I can use a vImage_Buffer to work on my frames.

The problem is that I have no idea how to get back to a CVPixelBufferRef from the output vImage_Buffer.

Here is the code that is given in the other article:

NSInteger cropX0 = 100,
          cropY0 = 100,
          cropHeight = 100,
          cropWidth = 100,
          outWidth = 480,
          outHeight = 480;

// Source frame from the capture callback (assumed to be BGRA).
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

// Wrap the crop region of the source frame in a vImage_Buffer.
vImage_Buffer inBuff;
inBuff.height = cropHeight;
inBuff.width = cropWidth;
inBuff.rowBytes = bytesPerRow;

// Offset of the crop origin within the frame (4 bytes per BGRA pixel).
size_t startpos = cropY0 * bytesPerRow + 4 * cropX0;
inBuff.data = (unsigned char *)baseAddress + startpos;

// Destination buffer for the scaled 480x480 output.
unsigned char *outImg = (unsigned char *)malloc(4 * outWidth * outHeight);
vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4 * outWidth};

vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, 0);
if (err != kvImageNoError) NSLog(@"error %ld", err);

And now I need to convert outBuff to a CVPixelBufferRef.

I assume I need to use vImageBuffer_CopyToCVPixelBuffer, but I'm not sure how.

My first attempts failed with an EXC_BAD_ACCESS:

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorSystemDefault, 480, 480, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
vImage_CGImageFormat format = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst,  //BGRX8888
    .colorSpace = NULL,  //sRGB
};
    
vImageBuffer_CopyToCVPixelBuffer(&outBuff,
                                 &format,
                                 pixelBuffer,
                                 NULL,
                                 NULL,
                                 kvImageNoFlags);  // Here is the crash!
    
    
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Any idea?

– Nils Ziehn
  • To which chunk of bytes does the EXC_BAD_ACCESS correspond? The vImage_Buffer.data or the CVPixelBufferRef base address? – Ian Ollmann Mar 04 '15 at 21:00
  • Hi Ian, how exactly can I find that out? – Nils Ziehn Mar 05 '15 at 15:01
  • Run your code in the debugger. When you crash, you should see something like "EXC_BAD_ACCESS (Code=X, address=ADDR)". At that point, use the debugger to look at the values of pixelBuffer and outBuff.data. One of them should be the same as or close to ADDR. – Stephen Canon Mar 05 '15 at 15:36
  • (This is assuming that you're actually crashing in CopyToCVPixelBuffer, and not earlier; what function are you crashing in?) – Stephen Canon Mar 05 '15 at 15:37
  • Well, the debugger stops at vImageBuffer_CopyToCVPixelBuffer, but the address is actually 0x0, while all of the input arguments (outBuff, format, and pixelBuffer) are something other than 0x0. The code is 1. – Nils Ziehn Mar 05 '15 at 17:08
  • Seems worthy of a bug report if you have a small reproducible code example. It could be that the NULL color space or background color didn't get handled correctly or something. Maybe the new pixel buffer did something unexpected. – Ian Ollmann Mar 05 '15 at 18:43
  • A crash report log would be handy too, in case it doesn't reproduce. – Ian Ollmann Mar 05 '15 at 18:49
  • No, sorry :/ I stopped looking after a few hours of research – Nils Ziehn Apr 26 '15 at 09:43

2 Answers

NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithBool:YES], (__bridge id)kCVPixelBufferCGImageCompatibilityKey,
    [NSNumber numberWithBool:YES], (__bridge id)kCVPixelBufferCGBitmapContextCompatibilityKey,
    [NSNumber numberWithInt:480], (__bridge id)kCVPixelBufferWidthKey,
    [NSNumber numberWithInt:480], (__bridge id)kCVPixelBufferHeightKey,
    nil];

CVPixelBufferRef pixbuffer = NULL;
CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                               480,
                                               480,
                                               kCVPixelFormatType_32BGRA,
                                               outImg,
                                               4 * outWidth,  // rowBytes of the scaled image, not the camera frame
                                               NULL,
                                               NULL,
                                               (__bridge CFDictionaryRef)options,
                                               &pixbuffer);

You should create a new pixel buffer from the scaled bytes, as above.
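One caveat: outImg is malloc'd, so those bytes still have to be freed once the pixel buffer is finished with them. A minimal sketch, assuming the variables above; the callback name releaseScaledBytes is just illustrative:

    // Core Video calls this when the pixel buffer is released,
    // so the malloc'd scale output does not leak.
    static void releaseScaledBytes(void *releaseRefCon, const void *baseAddress) {
        free((void *)baseAddress);
    }

Pass releaseScaledBytes as the releaseCallback argument of CVPixelBufferCreateWithBytes (the first of the two NULLs above); otherwise you have to free(outImg) yourself once nothing uses pixbuffer any more.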

– 小东邪
  • Just in case: if you only want a cropped live video feed in your interface, use an AVPlayerLayer, an AVCaptureVideoPreviewLayer, and/or another CALayer subclass, and use the layer's bounds, frame, and position to map your 100x100 pixel area into the 480x480 area (rough sketch below).
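A rough sketch of that idea, purely for illustration: containerView, session, and the 1920x1080 feed size are assumptions, and the exact geometry depends on your capture preset and orientation.

    // needs AVFoundation: #import <AVFoundation/AVFoundation.h>

    // Show the 100x100 region at (100,100) of the feed, scaled into a 480x480 view.
    CGFloat feedW = 1920.0, feedH = 1080.0;     // assumed capture resolution
    CGFloat scale = 480.0 / 100.0;              // crop size -> view size

    containerView.layer.masksToBounds = YES;    // clip to the 480x480 area

    AVCaptureVideoPreviewLayer *preview =
        [AVCaptureVideoPreviewLayer layerWithSession:session];
    preview.videoGravity = AVLayerVideoGravityResize;   // stretch the feed to fill the layer

    // Scale the whole feed, then offset it so the crop region lands in the container.
    preview.frame = CGRectMake(-100.0 * scale, -100.0 * scale,
                               feedW * scale, feedH * scale);
    [containerView.layer addSublayer:preview];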

Notes on vImage for your question (details may differ in other circumstances):

  1. CVPixelBufferCreateWithBytes will not work with vImageBuffer_CopyToCVPixelBuffer() because you need to copy the vImage_Buffer data into a "clean" or "empty" CVPixelBuffer.

  2. No need for locking/unlocking here; just make sure you know when to lock and when not to lock pixel buffers.

  3. Your inBuff vImage_Buffer just needs to be initialized from the pixel buffer data, not filled in manually (unless you know how to use CGContexts, etc. to set up the pixel grid).

  4. Use vImageBuffer_InitWithCVPixelBuffer() (see the sketch after these notes).

  5. vImageScale_ARGB8888 will scale the entire CVPixelBuffer data to a smaller/larger rectangle. It won't scale just a portion/crop area of the buffer into another buffer.

  6. When you use vImageBuffer_CopyToCVPixelBuffer(), vImageCVImageFormatRef & vImage_CGImageFormat need to be filled out correctly.

    CGColorSpaceRef dstColorSpace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_709);
    
    vImage_CGImageFormat format = {
        .bitsPerComponent = 16,
        .bitsPerPixel = 64,
        .bitmapInfo = (CGBitmapInfo)kCGImageAlphaPremultipliedLast  |  kCGBitmapByteOrder16Big ,
        .colorSpace = dstColorSpace
    };
    vImageCVImageFormatRef vformat = vImageCVImageFormat_Create(kCVPixelFormatType_4444AYpCbCr16,
                                                                kvImage_ARGBToYpCbCrMatrix_ITU_R_709_2,
                                                                kCVImageBufferChromaLocation_Center,
                                                                format.colorSpace,
                                                                0);
    
    CVPixelBufferRef destBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          480,
                                          480,
                                          kCVPixelFormatType_4444AYpCbCr16,
                                          NULL,
                                          &destBuffer);
    
    NSParameterAssert(status == kCVReturnSuccess && destBuffer != NULL);
    
    // sourceBuffer is the vImage_Buffer filled by vImageBuffer_InitWithCVPixelBuffer().
    vImage_Error err = vImageBuffer_CopyToCVPixelBuffer(&sourceBuffer, &format, destBuffer, vformat, NULL, kvImagePrintDiagnosticsToConsole);
    

NOTE: these are settings for 64-bit ProRes with alpha; adjust for 32-bit.
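For the asker's 32-bit BGRA case, here is a minimal sketch of the whole round trip (the one referenced in note 4). It assumes the capture output is configured for kCVPixelFormatType_32BGRA; srcPixelBuffer, dstPixelBuffer, and the 480x480 size are illustrative:

    #import <Accelerate/Accelerate.h>

    CVPixelBufferRef srcPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CGColorSpaceRef rgbSpace = CGColorSpaceCreateDeviceRGB();
    vImage_CGImageFormat format = {
        .bitsPerComponent = 8,
        .bitsPerPixel = 32,
        .bitmapInfo = (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst),  // BGRX8888
        .colorSpace = rgbSpace,
    };
    CGFloat bgColor[4] = {0, 0, 0, 0};   // background used if alpha has to be flattened

    // Note 6: describe how the CVPixelBuffer's pixels relate to the CG format.
    vImageCVImageFormatRef srcCVFormat =
        vImageCVImageFormat_CreateWithCVPixelBuffer(srcPixelBuffer);
    vImageCVImageFormat_SetColorSpace(srcCVFormat, rgbSpace);

    // Note 4: let vImage allocate and fill inBuff from the source frame;
    // no manual base address / rowBytes bookkeeping and no locking needed (note 2).
    vImage_Buffer inBuff;
    vImage_Error err = vImageBuffer_InitWithCVPixelBuffer(&inBuff, &format,
                                                          srcPixelBuffer, srcCVFormat,
                                                          bgColor, kvImageNoFlags);

    // Scale into a 480x480 working buffer.
    vImage_Buffer outBuff;
    err = vImageBuffer_Init(&outBuff, 480, 480, 32, kvImageNoFlags);
    err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, kvImageNoFlags);

    // Note 1: copy the result into a fresh, "empty" CVPixelBuffer.
    CVPixelBufferRef dstPixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, 480, 480,
                        kCVPixelFormatType_32BGRA, NULL, &dstPixelBuffer);
    vImageCVImageFormatRef dstCVFormat =
        vImageCVImageFormat_CreateWithCVPixelBuffer(dstPixelBuffer);
    vImageCVImageFormat_SetColorSpace(dstCVFormat, rgbSpace);

    err = vImageBuffer_CopyToCVPixelBuffer(&outBuff, &format, dstPixelBuffer,
                                           dstCVFormat, bgColor,
                                           kvImagePrintDiagnosticsToConsole);

    // Clean up everything that is not ARC-managed (see the comments below).
    free(inBuff.data);
    free(outBuff.data);
    vImageCVImageFormat_Release(srcCVFormat);
    vImageCVImageFormat_Release(dstCVFormat);
    CGColorSpaceRelease(rgbSpace);
    // ... use dstPixelBuffer, then CVPixelBufferRelease(dstPixelBuffer);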

– Paul-J
  • Don't forget (NB, very NB: Memory Leak City): 1. You must release CG objects yourself (color space, etc.). – Paul-J Nov 01 '19 at 19:47
  • 2. You must also free(whatEver.data) for your buffers. None of these functions support ARC, and if you are dealing with many video frames, you will soon grind to a halt. – Paul-J Nov 01 '19 at 19:56
  • Hi Paul, can you please answer my SO question (https://stackoverflow.com/questions/60904676/need-help-in-screen-recording-a-part-of-the-screen-in-ios), it's related to this answer and I'm not sure how to do it in swift. – Felix Marianayagam Mar 28 '20 at 19:46
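Pulling those cleanup comments together, a short sketch using the identifiers from the snippet in this answer (assuming nothing else still needs them):

    // None of these are ARC-managed, so release/free them explicitly
    // when you are done with them.
    free(sourceBuffer.data);                // pixels malloc'd by vImage
    CGColorSpaceRelease(dstColorSpace);     // CG objects must be released manually
    vImageCVImageFormat_Release(vformat);
    CVPixelBufferRelease(destBuffer);       // once you're done with the output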