I'm capturing an image using AVFoundation, and I'm using AVCaptureVideoPreviewLayer to display the camera feed on screen. The preview layer's frame is set to the bounds of a UIView with dynamic dimensions:
AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CALayer *rootLayer = [self.cameraFeedView layer];
[rootLayer setMasksToBounds:YES];
// Size the preview layer to fill the host view.
previewLayer.frame = rootLayer.bounds;
[rootLayer insertSublayer:previewLayer atIndex:0];
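For reference, since the layer uses AVLayerVideoGravityResizeAspectFill, only a cropped portion of the camera frame is actually visible on screen. AVCaptureVideoPreviewLayer can report that visible region in normalized capture coordinates; a minimal sketch, assuming iOS 7+ and the previewLayer configured above:

// Maps the layer's bounds to a normalized (0..1) rect of the capture output,
// i.e. the portion of the camera frame the aspect-fill preview shows.
CGRect visibleRect = [previewLayer metadataOutputRectOfInterestForRect:previewLayer.bounds];
NSLog(@"Visible portion of the capture: %@", NSStringFromCGRect(visibleRect));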
And I'm using AVCaptureStillImageOutput to capture an image:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
// The output must be attached to the session before it has a connection.
[session addOutput:stillImageOutput];
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *capturedImage = [UIImage imageWithData:imageData];
    }
}];
My problem is that the captured image comes back at the full resolution of the iPhone's front camera (1280x960), but I need it to have the same aspect ratio as the preview layer. For example, if the preview layer is 150x100, I need the captured image to be 960x640. Is there any solution for this?
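To make the goal concrete, here is a sketch of the center-crop I have in mind, assuming all that's needed is to cut the captured image down to the preview layer's aspect ratio (the cropImage:toAspectRatio: helper is hypothetical, and UIImage orientation is ignored for simplicity):

// Hypothetical helper: center-crop an image to a target width/height ratio.
// Note: operates on the raw CGImage, so UIImage orientation is not handled.
- (UIImage *)cropImage:(UIImage *)image toAspectRatio:(CGFloat)aspect {
    CGFloat width = CGImageGetWidth(image.CGImage);
    CGFloat height = CGImageGetHeight(image.CGImage);
    CGRect cropRect;
    if (width / height > aspect) {
        // Too wide: trim the sides.
        CGFloat newWidth = height * aspect;
        cropRect = CGRectMake((width - newWidth) / 2.0, 0.0, newWidth, height);
    } else {
        // Too tall: trim the top and bottom.
        CGFloat newHeight = width / aspect;
        cropRect = CGRectMake(0.0, (height - newHeight) / 2.0, width, newHeight);
    }
    CGImageRef croppedRef = CGImageCreateWithImageInRect(image.CGImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    return cropped;
}

// Usage, e.g. inside the capture completion handler:
// CGFloat aspect = self.cameraFeedView.bounds.size.width / self.cameraFeedView.bounds.size.height;
// UIImage *cropped = [self cropImage:capturedImage toAspectRatio:aspect];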