I'm trying to get a camera preview working on two different view controllers. This is a workaround for another SDK in this app that requires its own app delegate.
Basically, I have a camera preview that displays in a UIView, set up in viewDidLoad, and it works fine at startup. Then I press a button to present a second view controller with another camera preview that reads and processes QR codes. The two views have separate xibs, and the child view controller comes up without trouble, but its camera preview never loads, so scanning QR codes is pretty hard.
Here's what I've got:
This is the viewDidLoad for the parent view controller. Its camera preview works fine:
- (void)viewDidLoad {
    [super viewDidLoad];
    [DRDouble sharedDouble].delegate = self;
    NSLog(@"SDK Version: %@", kDoubleBasicSDKVersion);

    // Capture live preview
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    CALayer *viewLayer = cameraPreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = cameraPreview.bounds;
    [cameraPreview.layer addSublayer:captureVideoPreviewLayer];

    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device = [possibleDevices lastObject];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];
    [session startRunning];
}
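For reference, here's how I could guard that addInput call so a nil input never gets added (a sketch; -canAddInput: is the documented way to check whether a session will accept an input):

```objc
// Sketch: only add the input if it exists and the session accepts it
// (assumes the `session` and `input` variables from viewDidLoad above).
if (input && [session canAddInput:input]) {
    [session addInput:input];
} else {
    NSLog(@"Could not add camera input to the session: %@", error);
}
```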
Then I present the child view controller from this button action:
- (IBAction)switchScanView {
    [session stopRunning];
    [session release];
    [self presentViewController:[[ViewController alloc] init] animated:YES completion:nil];
}
And here's the viewDidLoad from the child view controller:
- (void)viewDidLoad {
    [super viewDidLoad];

    NSError *error = nil;
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return;
    }

    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession addInput:input];

    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];

    dispatch_queue_t dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];

    // Initialize the video preview layer and add it as a sublayer to the preview view's layer.
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_mviewPreview.layer.bounds];
    [_mviewPreview.layer addSublayer:_videoPreviewLayer];

    // Start video capture.
    [_captureSession startRunning];
}
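One thing I've wondered about (a sketch, untested): in viewDidLoad the views from the xib may not have their final geometry yet, so _mviewPreview.layer.bounds could still be zero when I set the preview layer's frame. Re-applying the frame after layout would rule that out:

```objc
// Sketch: re-apply the preview layer's frame once layout has actually
// happened, in case _mviewPreview.bounds was still zero in viewDidLoad.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    _videoPreviewLayer.frame = _mviewPreview.layer.bounds;
}
```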
I suspect it has something to do with running an AVCaptureSession from two different controllers, but as you can see, in the button action I call both stopRunning and release on the first session, and the child's preview still never appears.
I'm at a loss for what else to try. Does anyone see what's happening?