
I'm trying to take a picture from both cameras on an iOS device at the same time. I'd also like to have a live preview of both cameras on screen. I use this code:

- (void)prepareCameraView:(UIView *)window
{
    NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        CALayer *viewLayer = window.layer;
        NSLog(@"viewLayer = %@", viewLayer);

        AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        captureVideoPreviewLayer.frame = CGRectMake(0.0f, 0.0f, window.bounds.size.width/2.0f, window.bounds.size.height);
        [window.layer addSublayer:captureVideoPreviewLayer];

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:[captureDevices objectAtIndex:0] error:&error];
        if (input)
        {
            [session addInput:input];
        }
        else
        {
            NSLog(@"ERROR : trying to open camera : %@", error);
        }

        [session startRunning];
    }

    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        CALayer *viewLayer = window.layer;
        NSLog(@"viewLayer = %@", viewLayer);

        AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        captureVideoPreviewLayer.frame = CGRectMake(window.bounds.size.width/2.0f, 0.0f, window.bounds.size.width/2.0f, window.bounds.size.height);
        [window.layer addSublayer:captureVideoPreviewLayer];

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:[captureDevices objectAtIndex:1] error:&error];
        if (input)
        {
            [session addInput:input];
        }
        else
        {
            NSLog(@"ERROR : trying to open camera : %@", error);
        }

        [session startRunning];
    }

}

But when the app starts the session for the front camera, the session for the back camera stops and leaves a still image.

Is there a way to display output from both cameras live?

Thanks

Abel

1 Answer


No, it isn't. Only one camera feed can be used at a time with AVCaptureSession.

Two camera inputs cannot be active simultaneously, so as soon as one session starts running, the other stops.

Your best bet is to create two sessions: start the first, and as soon as it delivers a frame, stop it and start the second; then stop the second, restart the first, and keep alternating. This will work, but there will be noticeable latency in the frames you receive.
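
A rough sketch of that ping-pong approach (the property names `frontSession` and `backSession` are my own; each session is assumed to be preconfigured elsewhere with an AVCaptureVideoDataOutput whose sample-buffer delegate is this object):

    // Fires once per captured frame; swap which session is running.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // startRunning/stopRunning block, so hop off the capture queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (self.frontSession.isRunning) {
                [self.frontSession stopRunning];
                [self.backSession startRunning];
            } else {
                [self.backSession stopRunning];
                [self.frontSession startRunning];
            }
        });
    }

Each session tear-up takes time, which is where the latency comes from.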

lostInTransit
  • Ok, but is it possible to take a picture of both cameras without having a live preview ? – Abel Mar 16 '13 at 11:30
  • I think it is better to only have one session running all the time and switch cameras with [AVsession beginConfiguration]; [AVsession addInput:inputCam]; [AVsession commitConfiguration]; You will still have some latency though – Sten Mar 16 '13 at 16:10
  • Although it might work, I can see this solution being so slow it is unusable. A better solution is to use an AVCaptureVideoDataOutput and draw the output using the AVCaptureVideoDataOutputSampleBufferDelegate. This solution is detailed here: http://stackoverflow.com/questions/16543075/avcapturesession-with-multiple-previews/25167597#25167597 – Johnny Aug 07 '14 at 18:24
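
A minimal sketch of the single-session switch Sten describes (the `session`, `frontInput`, and `backInput` properties are assumed to exist; only the camera input changes between the begin/commit pair):

    // Swap the active camera input inside one configuration transaction.
    - (void)switchToInput:(AVCaptureDeviceInput *)newInput
    {
        [self.session beginConfiguration];
        // Remove whichever camera input is currently attached.
        for (AVCaptureDeviceInput *oldInput in self.session.inputs) {
            [self.session removeInput:oldInput];
        }
        if ([self.session canAddInput:newInput]) {
            [self.session addInput:newInput];
        }
        [self.session commitConfiguration];
    }

Because the session itself keeps running, this avoids the full startRunning/stopRunning cycle, though the hardware still needs a moment to reconfigure.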