
I need to obtain a UIImage from the uncompressed image data of a CMSampleBufferRef. I'm using this code:

[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // that famous function from Apple docs found on a lot of websites
    // does NOT work for still images
    UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
}];

http://developer.apple.com/library/ios/#qa/qa1702/_index.html is a link to the imageFromSampleBuffer function.

But it does not work properly. :(

There is the jpegStillImageNSDataRepresentation: method, but it gives compressed data (well, because it's JPEG).
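
For reference, that compressed path looks roughly like this (just a sketch; jpegStillImageNSDataRepresentation: requires the output to be configured for JPEG):

NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *compressedImage = [UIImage imageWithData:jpegData]; // compressed, not what I want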

How can I get a UIImage created from the most raw, non-compressed data after capturing a still image?

Maybe I should specify some settings for the video output? I'm currently using these:

captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
captureStillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

I've noticed that the output has a default value for AVVideoCodecKey, which is AVVideoCodecJPEG. Can it be avoided in any way, or does it even matter when capturing a still image?
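
One way I can think of to sanity-check the BGRA setting (just a sketch on my part, not from the docs) is to log the output's supported uncompressed formats:

// Each entry is an NSNumber wrapping a pixel format constant such as kCVPixelFormatType_32BGRA
for (NSNumber *pixelFormat in captureStillImageOutput.availableImageDataCVPixelFormatTypes) {
    NSLog(@"Supported uncompressed pixel format: %@", pixelFormat);
}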

I found something here: Raw image data from camera like "645 PRO", but I need just a UIImage, without using OpenCV, OGLES, or other third-party libraries.

– dreamzor

2 Answers


The imageFromSampleBuffer method does work; in fact, I'm using a changed version of it. But if I remember correctly, you need to set the outputSettings right. I think you need to set the key to kCVPixelBufferPixelFormatTypeKey and the value to kCVPixelFormatType_32BGRA.

So for example:

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;                                 
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];                
NSDictionary* outputSettings = [NSDictionary dictionaryWithObject:value forKey:key];

[newStillImageOutput setOutputSettings:outputSettings];

EDIT

I am using those settings to take still images, not video. Is your sessionPreset set to AVCaptureSessionPresetPhoto? There may be problems with that:

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
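
In case the rest of the session setup matters, this is roughly how I wire the output into such a session (a sketch, assuming the default back camera and omitting error handling):

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];

if (cameraInput && [newCaptureSession canAddInput:cameraInput]) {
    [newCaptureSession addInput:cameraInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}
[newCaptureSession startRunning];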

EDIT 2

The part about saving it to a UIImage is identical to the one from the documentation. That's the reason I was asking for other origins of the problem, but I guess that was just grasping at straws. There is another way I know of, but it requires OpenCV.

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

I guess that is of no help to you, sorry. I don't know enough to think of other origins for your problem.

– thomketler
  • Do you use `imageFromSampleBuffer` for getting data from still images, not video? If yes, can you add your changed implementation to your answer, please? My settings are exactly the same as yours; see the second code block in the question :) – dreamzor Mar 27 '13 at 17:08
  • I'm sorry I should have read your question more carefully. I edited my answer. – thomketler Mar 28 '13 at 10:11
  • Thanks, of course it is that preset, etc. Can you please post your `imageFromSampleBuffer` implementation so I can try it out? – dreamzor Mar 28 '13 at 18:13
  • Sorry, I don't think that will help. – thomketler Apr 02 '13 at 08:29
  • It worked! But it has its orientation wrong :( But I will figure it out myself, thanks! – dreamzor Apr 02 '13 at 09:54
  • Can anyone please help with a sample project or sample code? I can't seem to make it work. I can make a repo with what I currently have, but at least reply so I know that someone sees this. – Bogus Apr 28 '16 at 18:52
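
Regarding the wrong orientation mentioned in the comments above: a minimal fix (a sketch, assuming a portrait capture from the back camera, whose buffers come out landscape) is to pass an explicit orientation when creating the UIImage in imageFromSampleBuffer:

// Instead of [UIImage imageWithCGImage:quartzImage]:
UIImage *image = [UIImage imageWithCGImage:quartzImage
                                     scale:1.0
                               orientation:UIImageOrientationRight];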

Here's a more efficient way:

// Note: +[UIImage imageWithData:] expects encoded image data (e.g. JPEG or PNG),
// so this only produces an image if the sample buffer actually contains such data.
UIImage *image = [UIImage imageWithData:[self imageToBuffer:sampleBuffer]];

- (NSData *) imageToBuffer:(CMSampleBufferRef)source {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer,0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    // Copy the raw pixel bytes (bytesPerRow * height, including any row padding).
    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}
– James Bush