
I'm trying to display a UIImage in real time coming from the camera, and it seems that my UIImageView is not displaying the image properly. This is the method that an AVCaptureVideoDataOutputSampleBufferDelegate has to implement:

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection
{ 
    // Create a UIImage from the sample buffer data
    UIImage *theImage = [self imageFromSampleBuffer:sampleBuffer];
//    NSLog(@"Got an image! %f %f", theImage.size.width, theImage.size.height);
//    NSLog(@"The image view is %@", imageView);
//    UIImage *theImage = [[UIImage alloc] initWithData:[NSData 
//        dataWithContentsOfURL:[NSURL 
//        URLWithString:@"http://farm4.static.flickr.com/3092/2915896504_a88b69c9de.jpg"]]];
    [self.session stopRunning];
    [imageView setImage: theImage];
}

To get the easy problems out of the way:

  • Using UIImagePickerController is not an option (eventually we will actually do things with the image)
  • I know the handler is being called (the NSLog calls are made, and I see the output)
  • I know I have the IBOutlet declarations set up correctly. If I use the commented code above to load an arbitrary image from the web instead of simply sending setImage:theImage to the imageView, the image is loaded correctly (and the second call to NSLog reports a non-nil object).
  • At least to a basic extent, the image I get from imageFromSampleBuffer: is fine, since NSLog reports the size to be 360x480, which is the size I expected.

The code I'm using is the recently posted AVFoundation snippet from Apple, available here.

In particular, that is where I got the code that sets up the AVCaptureSession object and friends (of which I understand very little) and creates the UIImage object from the Core Video buffers (that's the imageFromSampleBuffer: method).

Finally, I can get the application to crash if I try to send drawInRect: to a plain UIView subclass with the UIImage returned by imageFromSampleBuffer:, while it doesn't crash if I use a UIImage from a URL as above. Here is the stack trace from the debugger inside the crash (I get an EXC_BAD_ACCESS signal):

#0  0x34a977ee in decode_swap ()
#1  0x34a8f80e in decode_data ()
#2  0x34a8f674 in img_decode_read ()
#3  0x34a8a76e in img_interpolate_read ()
#4  0x34a63b46 in img_data_lock ()
#5  0x34a62302 in CGSImageDataLock ()
#6  0x351ab812 in ripc_AcquireImage ()
#7  0x351a8f28 in ripc_DrawImage ()
#8  0x34a620f6 in CGContextDelegateDrawImage ()
#9  0x34a61fb4 in CGContextDrawImage ()
#10 0x321fd0d0 in -[UIImage drawInRect:blendMode:alpha:] ()
#11 0x321fcc38 in -[UIImage drawInRect:] ()

EDIT: Here's some more information about the UIImage being returned by that bit of code.

Using the method described here, I can get to the pixels and print them, and they look OK at first glance (every value in the alpha channel is 255, for example). However, something is slightly off with the buffer sizes. The image I get from Flickr at that URL is 375x500, and its [pixelData length] gives me 750000 = 375*500*4, which is the expected value. However, the pixel data of the image returned from imageFromSampleBuffer: has size 691208 = 360*480*4 + 8, so there are 8 extra bytes in the pixel data. CVPixelBufferGetDataSize itself returns this off-by-8 value. I thought for a moment that it could be due to allocating buffers at aligned positions in memory, but 691200 is a multiple of 256, so that doesn't explain it either. This size discrepancy is the only difference I can tell between the two UIImages, and it could be causing the trouble. Still, there's no reason allocating extra memory for the buffer should cause an EXC_BAD_ACCESS violation.
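
A quick way to sanity-check where those extra bytes might come from is to compare the buffer's reported geometry against its reported data size, e.g. whether bytesPerRow differs from width*4 (row padding). This is just an illustrative sketch; logBufferGeometry: is a hypothetical helper you'd call from the capture delegate:

- (void)logBufferGeometry:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t width       = CVPixelBufferGetWidth(imageBuffer);        // expect 360
    size_t height      = CVPixelBufferGetHeight(imageBuffer);       // expect 480
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);  // may exceed width*4
    size_t dataSize    = CVPixelBufferGetDataSize(imageBuffer);     // the off-by-8 value above

    NSLog(@"w*4 = %zu, bytesPerRow = %zu, height*bytesPerRow = %zu, dataSize = %zu",
          width * 4, bytesPerRow, height * bytesPerRow, dataSize);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}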

Thanks a lot for any help, and let me know if you need more information.

Carlos Scheidegger
  • I'm running into the same issue. What is irritating is that it was Apple itself that provided us with the custom "imageFromSampleBuffer" function. – Daniel Amitay Jul 22 '10 at 22:36
  • Even more strangely, the image content itself is correct. I'm pushing the raw pixels through a socket to a different machine and it is exactly the video frames captured from the camera in the right format. – Carlos Scheidegger Jul 22 '10 at 23:59

5 Answers


I had the same problem ... but I found this old post, and its method of creating the CGImageRef works!

http://forum.unity3d.com/viewtopic.php?p=300819

Here's a working sample:

app has a member UIImageView *theImage;

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // ... just an example of how to get an image out of this ...
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    theImage.image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
}

- (CGImageRef) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer // Create a CGImageRef from sample buffer data
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    CVPixelBufferLockBaseAddress(imageBuffer,0);        // Lock the image buffer 

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);   // Get information about the image 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef newImage = CGBitmapContextCreateImage(newContext); 
    CGContextRelease(newContext); 

    CGColorSpaceRelease(colorSpace); 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 
    /* CVBufferRelease(imageBuffer); */  // do not call this! We don't own imageBuffer (Get rule), so releasing it would over-release

    return newImage;
}
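
Note that imageFromSampleBuffer: here returns a +1 CGImageRef (CGBitmapContextCreateImage follows the Create rule), so the caller owns the image. That's why the delegate above calls CGImageRelease once the CGImage has been wrapped in a UIImage; imageWithCGImage: retains its own reference.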
Ken Pletzer
  • Nice - I'll try this as soon as I finish one other thing, and I'll get back to you. – Carlos Scheidegger Jul 26 '10 at 14:41
  • So, I'm a complete idiot when it comes to the iOS API, and I don't know how to push the pixels into the actual CGImage. This seems to be achieved by the `memcpy(mImageData, ...)` call, but mImageData is actually not declared anywhere in that snippet. Do you know how I can get access to such a pointer with the iOS API? – Carlos Scheidegger Jul 29 '10 at 13:49
  • Ah, yes, I don't even need that - the CGImageRef had already been created. Thanks for that. Do you mind editing your answer to add that bit of code inline for future reference in StackOverflow? I'll give you the bounty in any case, but I think that will be most helpful. – Carlos Scheidegger Jul 29 '10 at 14:26
  • For some reason I'm getting: CGBitmapContextCreateImage: invalid context 0x0. Any ideas? – Tomas Andrle Jun 05 '12 at 21:38
  • Be aware: transferring the pixels via a CGContextRef will not transfer the sample buffer's metadata (i.e. EXIF data). – stigi May 15 '13 at 18:05
  • @TomA have you found the answer ? – onmyway133 Oct 11 '13 at 04:28
  • This is working on iPhone 5, iPad 3, and iPad 2, but not on iPhone 4 (B & W); I mean the image displays, but not continuously. – Muruganandham K Nov 18 '13 at 04:25
  • IMPORTANT: It works, but the saving action should take place in an asynchronous dispatch on the main queue. – Vanya Oct 24 '14 at 09:41
  • I'm also getting 'Invalid context 0x0' ... Any findings since then @TomA ? – Shai Mishali Dec 30 '14 at 12:03
  • same here, I am getting error ": CGBitmapContextCreateImage: invalid context 0x0. This is a serious error. This application, or a library it uses, is using an invalid context and is thereby contributing to an overall degradation of system stability and reliability. This notice is a courtesy: please fix this problem. It will become a fatal error in an upcoming update." Any ideas guys? – sudoExclaimationExclaimation Aug 09 '15 at 07:37
  • For those getting `: CGBitmapContextCreateImage: invalid context 0x0.` see [answer](http://stackoverflow.com/a/12973348/5152481) by @vladimir below! This was my solution. – gwest7 Jan 28 '16 at 09:58
  • I had to change it to `kCGBitmapByteOrder32Big` otherwise the RGB channels would be mixed up. – tettoffensive Jun 16 '16 at 18:46
  • For those getting: CGBitmapContextCreateImage: invalid context 0x0. Please check that captureOutput is an AVCaptureVideoDataOutput. – Nimrod Borochov Aug 21 '19 at 09:34

Live capture of video frames is now well explained by Apple's Technical Q&A QA1702:

https://developer.apple.com/library/ios/#qa/qa1702/_index.html

matt

It is also important to set the right output format. I had problems capturing images with the default format settings. It should be:

[videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey]];
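
For context, here is roughly where that setting fits when wiring up the session. This is a sketch, not Vladimir's code: the session variable and the surrounding calls are assumed from a standard AVFoundation setup. The BGRA format matters because it matches the kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst flags passed to CGBitmapContextCreate above; with the default YUV output, that bitmap context can't be created, which is one way to end up with the "CGBitmapContextCreateImage: invalid context 0x0" error from the comments.

AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

// Ask for BGRA frames so the bytes match the CGBitmapContextCreate flags above
[videoDataOutput setVideoSettings:[NSDictionary
    dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                  forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey]];

if ([session canAddOutput:videoDataOutput])
    [session addOutput:videoDataOutput];
[videoDataOutput release]; // the session retains the output (pre-ARC style)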
Vladimir

Ben Loulier has a good write-up on how to do this.

I am using his example app as a starting point, and it is working for me. Along with replacing the imageFromSampleBuffer function with something that creates a CGImageRef with CGBitmapContextCreate, he uses the main dispatch queue (via dispatch_get_main_queue()) when setting the output sample buffer delegate. This isn't ideal, because the delegate really wants a dedicated serial queue: the main queue is serial, but it is also busy with UI work, so heavy per-frame processing there can block the interface. Still, it seems to work for me so far :)
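
If you want the dedicated serial queue instead, the change is small. This is a sketch: myVideoQueue is just an illustrative label, and videoDataOutput is assumed to be the AVCaptureVideoDataOutput from earlier (Apple's QA1702 sample does essentially the same thing):

// A queue created with NULL attributes is serial: frames arrive one at a time, in order
dispatch_queue_t queue = dispatch_queue_create("myVideoQueue", NULL);
[videoDataOutput setSampleBufferDelegate:self queue:queue];
dispatch_release(queue); // the output retains the queue (pre-ARC style)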

sroske
  • This works, even without the different queue. I'm now suspicious of Apple's call to `CGDataProviderCreateWithData` in their code snippet, the only remaining difference in the code. – Carlos Scheidegger Jul 29 '10 at 14:22

Another thing to look out for is whether you're actually updating your UIImageView on the main thread: if you aren't, chances are it won't reflect any changes.

The captureOutput:didOutputSampleBuffer:fromConnection: delegate method is often called on a background thread. So you want to do this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{   
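    // Retain the buffer: it is only guaranteed to be valid until this method returns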
    CFRetain(sampleBuffer);

    [[NSOperationQueue mainQueue] addOperationWithBlock:^{

        //Now we're definitely on the main thread, so update the imageView:
        UIImage *capturedImage = [self imageFromSampleBuffer:sampleBuffer];

        //Display the image currently being captured:
        imageView.image = capturedImage;

        CFRelease(sampleBuffer);
    }];
}
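
The CFRetain/CFRelease pair is what makes this safe: the sample buffer is only guaranteed to be valid until the delegate method returns, so it has to be retained before being used asynchronously. One caveat: sample buffers come from a small fixed pool, so if the main queue gets backed up and buffers are held too long, the capture pipeline can start dropping frames.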
Eric