
I'm capturing video with AVCaptureSession. But I would like to convert the captured image to a UIImage.

I found some code on the Internet:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer
{

    NSLog(@"imageFromSampleBuffer: called");
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);


    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

But I got some errors:

Jan 17 17:39:25 iPhone-4-de-XXX ThinkOutsideTheBox[2363] <Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 2560 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
Jan 17 17:39:25 iPhone-4-de-XXX ThinkOutsideTheBox[2363] <Error>: CGBitmapContextCreateImage: invalid context 0x0
2013-01-17 17:39:25.896 ThinkOutsideTheBox[2363:907] image <UIImage: 0x1d553f00>
Jan 17 17:39:25 iPhone-4-de-XXX ThinkOutsideTheBox[2363] <Error>: CGContextDrawImage: invalid context 0x0
Jan 17 17:39:25 iPhone-4-de-XXX ThinkOutsideTheBox[2363] <Error>: CGBitmapContextGetData: invalid context 0x0

EDIT: I also use the UIImage to get the rgb color:

-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{

    UIImage* image = [self imageFromSampleBuffer:sampleBuffer];
    unsigned char* pixels = [image rgbaPixels];
    double totalLuminance = 0.0;
    for(int p=0;p<image.size.width*image.size.height*4;p+=4)
    {
        totalLuminance += pixels[p]*0.299 + pixels[p+1]*0.587 + pixels[p+2]*0.114;
    }
    totalLuminance /= (image.size.width*image.size.height);
    totalLuminance /= 255.0;
    NSLog(@"totalLuminance %f",totalLuminance);

}
Benoît Freslon
  • Per the warning, what are the values for the `width`, `height`, and `bytesPerRow` values when you receive the above errors? It sounds like your image buffer is being deallocated before you are trying to use it (possibly due to processing being done on separate threads). – Brad Larson Jan 17 '13 at 17:44
  • Also are you sure you're capturing RGB samples? The other types are more compact so would have a bytesPerRow less than that appropriate for that width of an RGB buffer. – Tommy Jan 17 '13 at 18:51
  • What log output do you get if you add `NSLog(@"Pixel format: 0x%x", CVPixelBufferGetPixelFormatType(imageBuffer));` in there? – Peter Hosey Jan 18 '13 at 06:07
  • @PeterHosey NSLog: Pixel format: 0x34323076 – Benoît Freslon Jan 18 '13 at 11:01
  • That's `'420v'`, which is `kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange`. Your buffer contents are not RGB pixels; they are described in [the documentation for that pixel format constant](http://developer.apple.com/library/ios/documentation/QuartzCore/Reference/CVPixelFormatDescriptionRef/Reference/reference.html#//apple_ref/c/econst/kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange). Attempting to create an RGB, ARGB, or RGBA context with this data will not work. – Peter Hosey Jan 18 '13 at 16:46
  • As Peter notes, you're going to need to either set your camera's output pixel buffer type to kCVPixelFormatType_32BGRA or perform some sort of conversion from the separate Y and UV planes in your YUV420p input to RGBA (or BGRA). I recommend the former, because CPU-bound conversion is really slow, and it's fairly complex to set up an OpenGL ES shader to do this. – Brad Larson Jan 18 '13 at 18:17

3 Answers


Your best bet will be to set the capture video data output's videoSettings to a dictionary that specifies the pixel format you want, which you'll need to set to some variation on RGB that CGBitmapContext can handle.

The documentation has a list of all of the pixel formats that Core Video can process. Only a tiny subset of those are supported by CGBitmapContext. The format that the code you found on the internet is expecting is kCVPixelFormatType_32BGRA, but that might have been written for Macs—on iOS devices, kCVPixelFormatType_32ARGB (big-endian) might be faster. Try them both, on the device, and compare frame rates.
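The videoSettings change described above might look something like this (an untested sketch; the `AVCaptureVideoDataOutput` instance is assumed to be the one already added to the session):

```objectivec
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask the capture pipeline to deliver BGRA pixel buffers, which
// CGBitmapContextCreate can consume directly.
videoOutput.videoSettings = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
};
```

With BGRA buffers, the `kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst` bitmap info in the question's code matches the buffer layout, and `bytesPerRow` will be at least `4 * width`.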

Peter Hosey
  • It seems `kCVPixelFormatType_32ARGB` is not supported on iOS. See AVCaptureOutput.h for a list of supported formats; the available pixel format types on this platform are (420v, 420f, BGRA). – demon Jul 17 '14 at 09:40
  • @demon so if BGRA is one of the available formats then the matching key would be kCVPixelFormatType_32BGRA, and this key would be used to capture PNG? – Crashalot Apr 04 '16 at 18:50
  • @Crashalot first of all, your video must contain the alpha channel, or it's pointless. – demon Apr 05 '16 at 03:01
  • @demon yes understood, but you're saying use kCVPixelFormatType_32BGRA instead of kCVPixelFormatType_32ARGB? – Crashalot Apr 05 '16 at 03:06

You can try this code.

-(UIImage *)screenshotOfVideoStream:(CMSampleBufferRef)samImageBuff
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(samImageBuff);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *temporaryContext = [CIContext contextWithOptions:nil];
    CGImageRef videoImage = [temporaryContext createCGImage:ciImage
                                                   fromRect:CGRectMake(0, 0,
                                                            CVPixelBufferGetWidth(imageBuffer),
                                                            CVPixelBufferGetHeight(imageBuffer))];

    UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
    CGImageRelease(videoImage);
    return image;
}

It works for me.

jayprakash

In case anyone else is expecting a JPEG image like me, there are simple APIs provided by Apple:

[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:photoSampleBuffer];

and:

[AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer]
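A hedged sketch of using the first API above inside the still-image capture callback (`stillImageOutput` and `connection` are assumed to be configured elsewhere):

```objectivec
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (imageSampleBuffer) {
        // Convert the sample buffer straight to JPEG data, then to a UIImage.
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // ... use image ...
    }
}];
```

Note this only works for still-image capture, not for the per-frame video sample buffers in the original question.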
wzso