
I have been looking into this for way too long now.

I am trying to get macOS webcam data and run a CIDetector on the frames the webcam outputs.

I know I need to:

  • connect an AVCaptureDevice (as an input) to an AVCaptureSession

  • connect an AVCaptureVideoDataOutput (as an output) to the AVCaptureSession

  • call .setSampleBufferDelegate(AVCaptureVideoDataOutputSampleBufferDelegate, DelegateQueue) (see the sketch below)
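
In other words, something like this (a minimal sketch with Swift 3 era APIs to match my code below; myDelegate stands for a stored MyDelegate instance):

if let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
   let input = try? AVCaptureDeviceInput(device: device) {
    let session = AVCaptureSession()
    session.addInput(input)                                   // 1. device in
    let output = AVCaptureVideoDataOutput()
    session.addOutput(output)                                 // 2. output out
    let queue = DispatchQueue(label: "VideoDataOutputQueue")  // serial by default
    output.setSampleBufferDelegate(myDelegate, queue: queue)  // 3. delegate + queue
    session.startRunning()
}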

For some reason, after calling .setSampleBufferDelegate(...) (and of course after calling .startRunning() on the AVCaptureSession instance), my AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput is not being called.

I found so many people having trouble with this online, but I was not able to find any solution.

It seems to me like it has to do with the DispatchQueue.

MyDelegate.swift:

import AVFoundation
import CoreImage

class MyDelegate: NSObject {

    var context: CIContext?
    var detector: CIDetector?

    override init() {
        context = CIContext()
        detector = CIDetector(ofType: CIDetectorTypeFace, context: context)
        print("set up!")
    }

}
extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("success?")
        let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let image: CIImage = CIImage(cvPixelBuffer: pixelBuffer)
        let features: [CIFeature] = detector!.features(in: image)
        for feature in features {
            print(feature.type)
            print(feature.bounds)
        }
    }

    func captureOutput(_: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("fail?")
    }
}

ViewController.swift:

var captureSession: AVCaptureSession
var captureDevice: AVCaptureDevice?
var previewLayer: AVCaptureVideoPreviewLayer?

var vdo: AVCaptureVideoDataOutput

var videoDataOutputQueue: DispatchQueue

override func viewDidLoad() {
    super.viewDidLoad()

    camera.layer = CALayer()

    // Do any additional setup after loading the view, typically from a nib.
    captureSession.sessionPreset = AVCaptureSessionPresetLow

    // Get all audio and video devices on this machine
    let devices = AVCaptureDevice.devices()

    // Find the FaceTime HD camera object
    for device in devices! {
        print(device)

        // Camera object found; assign it to captureDevice
        if (device as AnyObject).hasMediaType(AVMediaTypeVideo) {
            print(device)
            captureDevice = device as? AVCaptureDevice
        }
    }

    if captureDevice != nil {
        do {
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            vdo.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: NSNumber(value: kCVPixelFormatType_32BGRA)]

            try captureDevice!.lockForConfiguration()
            captureDevice!.activeVideoMinFrameDuration = CMTimeMake(1, 30)
            captureDevice!.unlockForConfiguration()

            videoDataOutputQueue.sync {
                vdo.setSampleBufferDelegate(
                    MyDelegate,
                    queue: videoDataOutputQueue
                )
                vdo.alwaysDiscardsLateVideoFrames = true
                captureSession.addOutput(vdo)
                captureSession.startRunning()
            }
        } catch {
            print(AVCaptureSessionErrorKey.description)
        }
    }
}

All of the necessary variables inside viewDidLoad relating to AVFoundation have been instantiated inside the ViewController's init(). I've omitted that for clarity.

Any ideas?

Thanks, SO!

Kovek

EDIT: Fixed setting the delegate from self to MyDelegate.

And this is how I initialize videoDataOutputQueue:

    videoDataOutputQueue = DispatchQueue(
        label: "VideoDataOutputQueue"
    )
Slackware

3 Answers


I had a similar problem: in my case, the issue was that in Swift 4 you have to implement the following method:

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) 

instead of:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)

Hope it helps.
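
(For the AVCaptureVideoDataOutput in the question, the Swift 4 counterpart is analogous: the implicitly unwrapped parameters are gone and the method name changed.)

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)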

EDIT

This method has to be implemented by your AVCaptureMetadataOutputObjectsDelegate (e.g., your view controller). To start a QR code capture session you can try something like this:

    captureSession = AVCaptureSession()

    let videoCaptureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    var videoInput: AVCaptureDeviceInput? = nil

    do {
        if let v = videoCaptureDevice {
            videoInput = try AVCaptureDeviceInput(device: v)
        } else {
            print("Error: can't find videoCaptureDevice")
        }
    } catch {
        let ac = UIAlertController(title: "Error", message: error.localizedDescription, preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "Ok", style: .default))
        present(ac, animated: true)
        return
    }

    if let videoInput = videoInput {
        if captureSession.canAddInput(videoInput) {
            captureSession.addInput(videoInput)
        } else {
            // Show error
            return
        }
    } else {
        // Show error
        return
    }

    let metadataOutput = AVCaptureMetadataOutput()

    if captureSession.canAddOutput(metadataOutput) {
        captureSession.addOutput(metadataOutput)

        metadataOutput.setMetadataObjectsDelegate(/* YOUR DELEGATE */, queue: DispatchQueue.main)
        metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr, AVMetadataObject.ObjectType.code128]
    } else {
        // Show error
        return
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.layer.bounds
    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
Andrea Gorrieri

You made a mistake in the declaration of the required sample buffer delegate method:

captureOutput(_:didOutputSampleBuffer:from:).

Please check it and make sure it is:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)

PS: Pay attention to how the parameters of that method are declared. All of them have '!', which means they are implicitly unwrapped optionals.
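
Applied to the question's MyDelegate, the corrected extension would look roughly like this (a sketch using the Swift 3 era signature above):

extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        // This now matches the protocol's optional method, so AVFoundation
        // calls it on the queue passed to setSampleBufferDelegate(_:queue:).
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        for feature in detector!.features(in: image) {
            print(feature.type, feature.bounds)
        }
    }
}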

ninjaproger
  • my console right now: > success? > success? > success? ... Success! Thanks! As a follow-up, why does Xcode not notify me that the declaration is not correct? – Slackware Jul 10 '17 at 16:52
  • Xcode doesn't notify you because captureOutput(_:didOutputSampleBuffer:from:) is an optional method of the AVCaptureVideoDataOutputSampleBufferDelegate protocol – ninjaproger Jul 10 '17 at 17:00
  • 1
    I am on 10.15.6 with swift 5.3. I have the same problem with Slackware, however, when I used the signature (with the auto unwrapped in the parameters) provided in this answer, swift compiler complained "Parameters of 'captureOutput(_:didOutput:from:)' have different optionality than expected by protocol 'AVCaptureVideoDataOutputSampleBufferDelegate'" Any idea? Look like Apple changes something again? – psksvp Sep 18 '20 at 01:14
  • Just a follow-up from my previous comment: if I implement the protocol (AVCaptureVideoDataOutputSampleBufferDelegate) in an NSViewController-derived class, the normal signature captureOutput(_:didOutput:from:) without the implicitly unwrapped parameters works just fine. It just won't work if I have a separate class derived from NSObject implement the protocol, like Slackware did. – psksvp Sep 18 '20 at 01:23

In my case the delegate method was not being called because an AVCaptureMovieFileOutput had been added to the session before the AVCaptureVideoDataOutput. I'm guessing that only one video-related output can be added to a session. Adding only the AVCaptureVideoDataOutput solved the problem.
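
One way to catch this early is to check canAddOutput(_:) before adding the output (a sketch; captureSession stands for your configured session):

let videoDataOutput = AVCaptureVideoDataOutput()
if captureSession.canAddOutput(videoDataOutput) {
    captureSession.addOutput(videoDataOutput)
} else {
    // canAddOutput(_:) returns false when the session can't accept another
    // output, e.g. a conflicting AVCaptureMovieFileOutput is already attached.
    print("cannot add video data output")
}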

Kubba