
I'm working with AVCaptureVideoDataOutput and want to convert a CMSampleBufferRef to a UIImage. Many existing answers take the same approach, e.g. "UIImage created from CMSampleBufferRef not displayed in UIImageView?" and "AVCaptureSession with multiple previews".

It works fine if I set the VideoDataOutput color space to BGRA (credit to this answer to "CGBitmapContextCreateImage error"):

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[dataOutput setVideoSettings:videoSettings];

Without the videoSettings above, I receive the following errors:

CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
<Error>: CGBitmapContextCreateImage: invalid context 0x0

Working in BGRA is not a good choice, though, since there is conversion overhead from YUV (the default AVCaptureSession color space) to BGRA, as stated by Brad and Codo in "How to get the Y component from CMSampleBuffer resulted from the AVCaptureSession?".

So is there a way to convert a CMSampleBufferRef to a UIImage while working in the YUV color space?

onmyway133

2 Answers


After a lot of research and reading the Apple documentation and Wikipedia, I figured out the answer, and it works perfectly for me. For future readers, I'm sharing the code to convert a CMSampleBufferRef to a UIImage when the video pixel format is set to kCVPixelFormatType_420YpCbCr8BiPlanarFullRange.
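For reference, the corresponding capture configuration would be a sketch mirroring the question's BGRA snippet, with only the pixel format swapped:

```objc
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[dataOutput setVideoSettings:videoSettings];
```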

// Create a UIImage from sample buffer data
// Works only if pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
-(UIImage *) imageFromSamplePlanerPixelBuffer:(CMSampleBufferRef) sampleBuffer{

    @autoreleasepool {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the Y (luma) plane
        void *baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);

        // Get the number of bytes per row for the Y plane
        size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent gray color space
        // (only the Y plane is rendered, so the result is grayscale)
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceGray();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace, kCGImageAlphaNone);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage];

        // Release the Quartz image
        CGImageRelease(quartzImage);

        return (image);
    }
}
Bluewings
  • what about other colors? – JULIIncognito Jun 01 '16 at 16:18
  • @JULIIncognito change the color space to RGB – Bluewings Jun 01 '16 at 16:30
  • 1
    @Bluewings by changing the colorSpace to CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); it doesn't work... Is it working color image for you ? Thank – lilouch Jul 20 '16 at 09:19
  • @Bluewings by changing the colorSpace to CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(), the result is still grayscale image. Do you have any idea? – Linjie Mar 10 '17 at 00:50
  • @Linjie @lilouch what is your `AVCaptureVideoDataOutput` videoSettings dictionary values? – Bluewings Mar 21 '17 at 06:21
  • 1
    @Bluewings I figured out the problem. By using ```CGBitmapContextCreateImage``` it only extracts the Y channel so the result is always grayscale. I have to implement a function to manually convert YUV to RGB values following: http://stackoverflow.com/a/31553521/2298638 – Linjie Mar 21 '17 at 21:49
  • Best example on stack overflow. Thank you! – user3335999 May 25 '20 at 20:28

This works for me (note: VTCreateCGImageFromCVPixelBuffer requires iOS 9+ and import VideoToolbox):

import VideoToolbox

var image: CGImage?
VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &image)
DispatchQueue.main.async {
    if let image = image {
        self.imageView.image = UIImage(cgImage: image)
    }
}
Jobs