
I got this code from another Stack Overflow post, which seems to do exactly what I want:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
                                                                 sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc]
                              initWithDictionary:(__bridge NSDictionary*)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata
                                   objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    float brightnessValue = [[exifMetadata
                              objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    NSLog(@"AVCapture: %f", brightnessValue);
}

But since I don't know much about AVFoundation, I don't know how to use it... How do I get the AVCaptureOutput, CMSampleBufferRef and AVCaptureConnection objects?

Or in other words "how do I set up a video input using the AVFoundation framework"?

  • Your posted method is a delegate callback from the AVFoundation framework (the sample buffer delegate), so you don't have to get anything here. If you are trying to set up a capture session whose sample buffer delegate gets called, you can easily find the code in the AVFoundation Programming Guide: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html – Daniel Albertini Aug 17 '15 at 15:06
  • What do you mean by that? Maybe in simple terms: how do I get the float brightnessValue? I honestly don't understand your answer... – Bergrebell Aug 17 '15 at 15:15

1 Answer


The following code should help you set up a capture session with a sample buffer delegate. Note that self must conform to AVCaptureVideoDataOutputSampleBufferDelegate and implement the method from your question:

// Create the capture session
AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Use the default video camera as the input
NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

// Add a video data output; its sample buffer delegate receives every frame
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// Deliver the sample buffers on a serial background queue
dispatch_queue_t queue = dispatch_queue_create("VideoQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];

// Ask for BGRA pixel buffers and drop frames that arrive late
NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setVideoSettings:outputSettings];

output.alwaysDiscardsLateVideoFrames = YES;

Reference for more information: https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/04_MediaCapture.html
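
For completeness, here is a minimal sketch of how the setup above and the delegate method from your question fit into one class, and how the session actually gets started. The class name CameraViewController, the session property and the "VideoQueue" label are placeholders of my own, not part of the original answer; the rest uses only standard AVFoundation/ImageIO calls. Ownership of the attachments dictionary is handed over with __bridge_transfer here, so the explicit CFRelease from the question is not needed.

// Minimal sketch: one class that owns the session and receives the frames.
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>   // kCGImagePropertyExif... keys

@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation CameraViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    self.session = [[AVCaptureSession alloc] init];

    // Camera input, as in the answer above
    NSError *error = nil;
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (input) {
        [self.session addInput:input];
    }

    // Video data output that calls the delegate method below for every frame
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    output.alwaysDiscardsLateVideoFrames = YES;
    [output setSampleBufferDelegate:self queue:dispatch_queue_create("VideoQueue", NULL)];
    [self.session addOutput:output];

    // Start the flow of data; frames now arrive in the delegate callback
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Read the EXIF brightness value from the frame's metadata attachments
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = (__bridge_transfer NSDictionary *)metadataDict; // transfers ownership, no CFRelease needed
    NSDictionary *exifMetadata = metadata[(NSString *)kCGImagePropertyExifDictionary];
    float brightnessValue = [exifMetadata[(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];
    NSLog(@"AVCapture: %f", brightnessValue);
}

@end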