22

I am trying to use the new AVFoundation framework for taking still pictures with the iPhone.

When a button is pressed, this method is called. I can hear the shutter sound, but I can't see the log output. If I call this method several times, the camera preview freezes.

Is there any tutorial out there on how to use captureStillImageAsynchronouslyFromConnection?

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:
        [[self stillImageOutput].connections objectAtIndex:0]
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        NSLog(@"inside");
    }];

- (void)initCapture {
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput 
                                          deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] 
                                          error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];

    captureOutput.alwaysDiscardsLateVideoFrames = YES; 

    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [captureOutput setVideoSettings:videoSettings]; 

    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetLow;

    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];

    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];

    self.prevLayer.frame = CGRectMake(0.0, 0.0, 480.0, 320.0);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

    [self.view.layer addSublayer: self.prevLayer];


    // Setup the default file outputs
    AVCaptureStillImageOutput *_stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [_stillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    [self setStillImageOutput:_stillImageOutput];   

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];

}
Ravi Gautam
dan
  • I don't know if this was true back in 2010, but as of late 2011, I have easily been using `captureStillImageAsynchronouslyFromConnection` at the same time as getting a video feed using AVCaptureVideoDataOutput's delegate, `captureOutput`, and while using a previewLayer. I get a 5-megapixel image from stillImage, and 852x640 for the video feed and previewLayer. – mahboudz Sep 20 '12 at 08:48
  • The above comment was based on an iPhone 4. On an iPhone 4S, I get an 8-megapixel still, while getting 852x640 for the video feed and previewLayer. – mahboudz Sep 20 '12 at 18:36
  • What's the difference between initWithSession and layerWithSession? The docs don't discuss when to use one vs the other. http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureVideoPreviewLayer_Class/Reference/Reference.html – knite Oct 11 '12 at 07:02

5 Answers

62

After a lot of trial and error, I worked out how to do this.

Hint: Apple's official docs are, quite simply, wrong. The code they give you doesn't actually work.

I wrote it up here with step-by-step instructions:

http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/

There's a lot of code at the link, but in summary:

-(void) viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];

    [session startRunning];
}

-(IBAction) captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
         CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
         if (exifAttachments)
         {
            // Do something with the attachments.
            NSLog(@"attachements: %@", exifAttachments);
         }
        else
            NSLog(@"no attachments");

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];

        self.vImage.image = image;
     }];
}
Adam
  • I can't +1 this enough. Thank you for sharing this! – Art Gillespie Apr 19 '12 at 20:32
  • Tscott (in the comment below) has a working example of Adam's code. I downloaded and installed it from GitHub and initially it didn't work on my iPad. After checking the xib files, though, I saw that he had the iPhone xib file outlets connected to the code, but that the iPad xib was blank. After adding the imageview, labels, buttons, and other objects to the iPad xib file I was able to get the code to run. – Miriam P. Raphael Nov 11 '12 at 15:13
  • Updated link to the tutorial: http://red-glasses.com/index.php/tutorials/ios4-take-photos-with-live-video-preview-using-avfoundation/ – Ríomhaire Apr 24 '13 at 11:59
  • Wordpress sucks :( - upgrading wordpress kills the site, without reporting any errors. Fixed now. – Adam Feb 26 '14 at 17:09
  • 1
  • I know I'm late to the party, but those two `for` loops you use to get the video connection can be substituted by `[stillImageOutput connectionWithMediaType:AVMediaTypeVideo]` (I'm no expert, but at least it works for me...). Anyway, this example was a life-saver! Thank you. – Alex Apr 13 '14 at 21:56
  • I was so confused about why my `AVCaptureStillImageOutput` had no connections. It turns out it was because I initialized `AVCaptureSession` in `viewWillAppear`, when it should have been in `viewDidAppear`. Silly me. Cost me an entire day. Thanks for the answer! – dvdchr Feb 26 '15 at 07:30
  • The link in this answer returns a page that says "Error establishing a database connection". – halfer Nov 03 '19 at 09:25
16

We had this problem when 4.0 was still in beta. I tried a fair number of things; here's what I found:

  • AVCaptureStillImageOutput and AVCaptureVideoDataOutput do not appear to play nicely with each other. If the video output is running, the image output never seems to complete (until you pause the session by putting the phone to sleep; then you seem to get a single image out).
  • AVCaptureStillImageOutput only seems to work sensibly with AVCaptureSessionPresetPhoto; otherwise you effectively get JPEG-encoded video frames. Might as well use higher-quality BGRA frames (incidentally, the camera's native output appears to be BGRA; it doesn't appear to have the colour subsampling of 2vuy/420v).
  • The video (everything that isn't Photo) and Photo presets seem fundamentally different; you never get any video frames if the session is in photo mode (you don't get an error either). Maybe they changed this...
  • You can't seem to have two capture sessions (one with a video preset and a video output, one with Photo preset and an image output). They might have fixed this.
  • You can stop the session, change the preset to Photo, start the session, take the photo, and, when the photo completes, stop, change the preset back, and start again (sketched after this list). This takes a while, and the video preview layer stalls and looks terrible (it re-adjusts exposure levels). This also occasionally deadlocked in the beta (after calling -stopRunning, session.running was still YES).
  • You might be able to disable the AVCaptureConnection (it's supposed to work). I remember this deadlocking; they may have fixed this.
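
A rough sketch of that preset-swapping approach, for completeness (session, stillImageOutput and videoConnection are assumed to already exist, e.g. as set up in the accepted answer below; expect the preview to stall while this runs):

// Swap to the Photo preset just long enough to grab a full-resolution still, then restore the old preset.
[session stopRunning];
session.sessionPreset = AVCaptureSessionPresetPhoto;
[session startRunning];

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        // ... do something with jpegData ...
        [session stopRunning];
        session.sessionPreset = AVCaptureSessionPresetMedium; // or whichever preset you were using before
        [session startRunning];
    }];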

I ended up just capturing video frames. The "take picture" button simply sets a flag; in the video frame callback, if the flag is set, it returns the video frame instead of a UIImage*. This was sufficient for our image-processing needs — "take picture" exists largely so the user can get a negative response (and an option to submit a bug report); we don't actually want 2/3/5 megapixel images, since they take ages to process.
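
A minimal sketch of that flag approach, assuming an AVCaptureVideoDataOutput sample-buffer delegate (wantStillFrame and processCapturedFrame: are illustrative names, not from the original project):

// "Take picture" just raises a flag; the next video frame that arrives is treated as the photo.
- (IBAction)takePicture:(id)sender {
    wantStillFrame = YES;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (wantStillFrame) {
        wantStillFrame = NO;
        [self processCapturedFrame:sampleBuffer]; // hypothetical helper that consumes the frame
    }
}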

If video frames are not good enough (i.e. you want to capture viewfinder frames between high-res image captures), I'd first see whether they've fixed using multiple AVCapture sessions, since that's the only way you can set both presets.

It's probably worth filing a bug. I filed a bug around the launch of 4.0 GM; Apple asked me for some sample code, but by then I'd decided to use the video frame workaround and had a release to release.

Additionally, the "low" preset is very low-res (and results in a low-res, low-framerate video preview). I'd go for 640x480 if available, falling back to Medium if not.

tc.
6

This has been a huge help - I was stuck in the weeds for quite a while trying to follow the AVCam example.

Here is a complete working project with my comments that explain what is happening. This illustrates how you can use the capture manager with multiple outputs. In this example there are two outputs.

The first is the still image output of the example above.

The second provides frame-by-frame access to the video coming from the camera. You can add more code to do something interesting with the frames if you like. In this example I am just updating a frame counter on the screen from within the delegate callback.

https://github.com/tdsltm/iphoneStubs/tree/master/VideoCamCaptureExample-RedGlassesBlog/VideoCamCaptureExample-RedGlassesBlog
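
The frame-counting part of that delegate callback boils down to something like this (a sketch; frameCount and frameCountLabel are illustrative names, not necessarily what the project uses):

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per captured video frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    frameCount++;
    // The delegate runs on its own dispatch queue, so hop to the main queue before touching UIKit.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.frameCountLabel.text = [NSString stringWithFormat:@"%lu", (unsigned long)frameCount];
    });
}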

Tereus Scott
  • Worked great. Just a heads up to anyone who downloads this code from GitHub -- the iPad xib file is blank. You need to copy the iPhone xib objects and paste them into the iPad xib file, then connect the objects to their respective outlets. – Miriam P. Raphael Nov 11 '12 at 15:14
  • Does not work for me: not on an iPhone 5S with beta 8, not on an iPad with iOS 7.1.2. – Tõnu Samuel Sep 03 '14 at 05:56
3

Apple has some notes and example code on this:

Technical Q&A QA1702: How to capture video frames from the camera as images using AV Foundation

Michael Grinich
0

You should use Adam's answer, but if you use Swift (like most of you probably do nowadays), here's a Swift 1.2 port of his code:

  1. Make sure you import ImageIO
  2. Add a property private var stillImageOutput: AVCaptureStillImageOutput!
  3. Instantiate stillImageOutput before captureSession.startRunning():

Like this:

stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
captureSession.addOutput(stillImageOutput)

Then use this code to capture an image:

private func captureImage() {
    var videoConnection: AVCaptureConnection?
    for connection in stillImageOutput.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }
        if videoConnection != nil {
            break
        }
    }
    print("about to request a capture from: \(stillImageOutput)")
    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer: CMSampleBuffer!, error: NSError!) -> Void in
        let exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, nil)
        if let attachments = exifAttachments {
            // Do something with the attachments
            print("attachments: \(attachments)")
        } else {
            print("no attachments")
        }
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
        let image = UIImage(data: imageData)
        // Do something with the image
    }
}

This all assumes that you already have an AVCaptureSession setup and just need to take a still from it, as did I.

RyJ