
I'm using AVCaptureVideoPreviewLayer to capture a barcode. Everything works just fine until I try to display the AVCaptureVideoPreviewLayer on an iPad running iPadOS 13 beta 4 while my app/scene is only using part of the iPad screen (the rest of the screen contains another app or another scene of my own app). In this case, the AVCaptureVideoPreviewLayer only shows black.

I've searched extensively and looked through the relevant documentation, but I can't find anything about needing to request exclusive access to the capture device or capture session.

With the AVCaptureVideoPreviewLayer displayed in my app, I can toggle between seeing the proper preview and seeing it go black or frozen by adding or removing another multitasking app, so that my own app switches between using the entire iPad screen and using only half of it.

This is either an iPadOS bug or I'm missing some request for exclusive access (or something similar). The only remotely related API I found is the lockForConfiguration method of AVCaptureDevice, but since I'm not actually changing any configuration it shouldn't be needed (and when I tried it, it made no difference).
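For reference, this is roughly the lockForConfiguration attempt mentioned above (the exact placement around startRunning is a guess on my part; the lock call succeeds either way, and the preview still stays black in Split View):

    NSError *lockError = nil;
    // Attempt to gain "exclusive" access to the device before starting the session.
    if ([_videoCaptureDevice lockForConfiguration:&lockError]) {
        [_captureSession startRunning];
        [_videoCaptureDevice unlockForConfiguration];
    } else {
        NSLog(@"lockForConfiguration failed: %@", lockError);
    }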

You can replicate this problem by creating a new iOS app from the Single View App template in Xcode 11. Add a new Cocoa Touch class that extends UIViewController and fill its implementation with the code below. Add a button to the original view controller and wire up an action for the button that creates and presents the new view controller. You also need to add the NSCameraUsageDescription key to Info.plist.

#import "PreviewViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface PreviewViewController () <AVCaptureMetadataOutputObjectsDelegate>

@end

@implementation PreviewViewController {
    AVCaptureDevice *_videoCaptureDevice;
    AVCaptureSession *_captureSession;
    AVCaptureVideoPreviewLayer *_previewLayer;
    AVCaptureMetadataOutput *_metadataOutput;
}

- (void)deviceRotated {
    UIDeviceOrientation orient = [UIDevice currentDevice].orientation;

    switch (orient) {
        case UIDeviceOrientationPortrait:
            _previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            if ([self supportedInterfaceOrientations] & UIInterfaceOrientationMaskPortraitUpsideDown) {
                _previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            } else {
                _previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
            }
            break;
        case UIDeviceOrientationLandscapeLeft:
            _previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIDeviceOrientationLandscapeRight:
            _previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        default:
            break; // Ignore Unknown, FaceUp, and FaceDown (also silences the unhandled-enum warning)
    }
}

#pragma mark UIViewController methods

- (void)viewDidLoad {
    [super viewDidLoad];

    self.view.backgroundColor = UIColor.blackColor;

    _captureSession = [[AVCaptureSession alloc] init];

    _videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:_videoCaptureDevice error:&error];
    if (!videoInput) {
        NSLog(@"No input: %@", error);
        // TODO - no video input
        return;
    }

    if ([_captureSession canAddInput:videoInput]) {
        [_captureSession addInput:videoInput];
    } else {
        NSLog(@"Can't add input");
        // TODO - now what?
        return;
    }

    _metadataOutput = [[AVCaptureMetadataOutput alloc] init];
    if ([_captureSession canAddOutput:_metadataOutput]) {
        [_captureSession addOutput:_metadataOutput];
        // Must be set after the output has been added to the session.
        _metadataOutput.metadataObjectTypes = @[AVMetadataObjectTypeQRCode, AVMetadataObjectTypeEAN13Code];
    } else {
        NSLog(@"Can't add output");
        return;
    }

    _previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    _previewLayer.frame = self.view.layer.bounds;
    _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:_previewLayer];
}

- (void)viewWillLayoutSubviews {
    [super viewWillLayoutSubviews];

    _previewLayer.frame = self.view.layer.bounds;
}

- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    [_metadataOutput setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];

    if (!_captureSession.isRunning) {
        // Attempted using lockForConfiguration on _videoCaptureDevice here but it made no difference
        [_captureSession startRunning];
    }

    if (_previewLayer.connection.supportsVideoOrientation) {
        [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];

        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(deviceRotated) name:UIDeviceOrientationDidChangeNotification object:nil];

        [self deviceRotated]; // set initial
    }
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];

    [_metadataOutput setMetadataObjectsDelegate:nil queue:nil];

    if (_captureSession.isRunning) {
        [_captureSession stopRunning];
    }
}

- (void)viewDidDisappear:(BOOL)animated {
    [super viewDidDisappear:animated];

    if (_previewLayer.connection.supportsVideoOrientation) {
        [[NSNotificationCenter defaultCenter] removeObserver:self name:UIDeviceOrientationDidChangeNotification object:nil];

        [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
    }
}

@end
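For completeness, the presenting button action described above can be sketched like this (the method name showPreview: is just a placeholder; note that on iOS 13 you may want UIModalPresentationFullScreen, since modal presentations now default to a sheet style):

    #import "PreviewViewController.h"

    // In the original view controller; wired to the button's Touch Up Inside event.
    - (IBAction)showPreview:(id)sender {
        PreviewViewController *previewVC = [[PreviewViewController alloc] init];
        previewVC.modalPresentationStyle = UIModalPresentationFullScreen;
        [self presentViewController:previewVC animated:YES completion:nil];
    }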

Am I missing some needed code to gain exclusive access or might this be an iPadOS bug?

rmaddy
  • Is this problem new in iOS 13? Camera use has always been very limited when you're in multitasking mode. Otherwise you'd use too many resources when you are not the only frontmost app. Did you watch the Apple WWDC video on this topic? (Multitasking, not multiple windows; it was a couple of years ago.) I remember them saying something about this. – matt Sep 22 '19 at 00:51
  • @matt I don't know if this is something new in iOS 13. I don't have an iPad with iOS 12 to test this with and my app required full screen prior to the update I'm wrapping up for iOS 13 so I never had to deal with this until now and I just noticed this issue today. I'll see if I can find the video you are referring to. Thanks. – rmaddy Sep 22 '19 at 01:02
  • I don’t have an iOS 13 iPad. :) But look at what their Camera app does. I bet it is full screen only. – matt Sep 22 '19 at 01:13
  • @matt Ugh. I found the info in [WWDC 2015 session 211](https://developer.apple.com/videos/play/wwdc2015/211/). There's even the [AVCam sample app](https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avcam_building_a_camera_app?language=objc#) added for iOS 13 that shows how to handle this. I'll post a full answer once I implement my solution. – rmaddy Sep 22 '19 at 01:33

0 Answers