
Background:

I am using AVCaptureVideoDataOutput together with AVCaptureSession and various other AV tools to build a camera session. A live feed of what the camera sees is displayed on screen.

I use AVCaptureVideoDataOutput rather than AVCaptureMovieFileOutput because the frames from the live feed are processed with CIFilters. Now, I want to record what is being shown to the user when I press a button. My plan was to use the function below, since I believe it captures every frame. I infer this from Apple's documentation, which states:

Delegates receive this message whenever the output captures and outputs a new video frame, decoding or re-encoding it as specified by its videoSettings property. Delegates can use the provided video frame in conjunction with other APIs for further processing.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

I expected every frame to arrive here so that I could use an AVAssetWriter and append the buffer to a videoWriterInput. This approach is shown in the linked SO post, where the answer appends the sampleBuffer to the videoWriterInput each time captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) is called.
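For context, the writing side I had in mind follows the standard AVAssetWriter pattern. A minimal sketch (the output URL and video dimensions here are placeholder assumptions, not values from my project):

```swift
import AVFoundation

// Sketch of the intended AVAssetWriter setup.
// `outputURL` and the dimensions below are placeholders.
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
let videoWriterInput = AVAssetWriterInput(
    mediaType: .video,
    outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ])
videoWriterInput.expectsMediaDataInRealTime = true
writer.add(videoWriterInput)

// Then, inside captureOutput(_:didOutput:from:):
// if writer.status == .unknown {
//     writer.startWriting()
//     writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
// }
// if writer.status == .writing, videoWriterInput.isReadyForMoreMediaData {
//     videoWriterInput.append(sampleBuffer)
// }
```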

My Efforts:

I have attempted to replicate the above SO post, where the answer uses AVAssetWriter to write AVCaptureVideoDataOutput frames to a file. However, in my code, shown below, the AVCaptureVideoDataOutputSampleBufferDelegate method func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) is never called.

NOTE: This is not all of the code - I am including only the relevant portions. If something you need is missing, please let me know.

VideoCapture.swift

class VideoCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let captureSession = AVCaptureSession()
    private let videoDataOutput = AVCaptureVideoDataOutput()
    private let dataOutputQueue = DispatchQueue(label: "com.Camera.dataOutputQueue")
    private var videoConnection: AVCaptureConnection!

    //NEVER GETS CALLED
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("Buf: \(sampleBuffer)")
    }

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("Drop Buff: \(sampleBuffer)")
    }

    override init() {
        super.init()

        //Some Setup
        captureSession.sessionPreset = AVCaptureSession.Preset.high

        //...
        do {
            // video output
            videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
            videoDataOutput.alwaysDiscardsLateVideoFrames = true
            videoDataOutput.setSampleBufferDelegate(self, queue: dataOutputQueue)
            guard captureSession.canAddOutput(videoDataOutput) else { fatalError() }
            captureSession.addOutput(videoDataOutput)
            videoConnection = videoDataOutput.connection(with: .video)
        }
        //...
    }
}

AnotherFile.swift

class VC: UIViewController {

    private var videoCapture: VideoCapture!

    init() {
        super.init(nibName: nil, bundle: nil)
        self.videoCapture = VideoCapture()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    public override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard let videoCapture = videoCapture else {return}
        videoCapture.startCapture()
    }
}

What I Expected:

I expect the method

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

to be called and, at minimum, print out the buffer as in the example above. This is not happening, so I cannot proceed with appending the buffer to a videoWriterInput for recording.

What Actually Happens:

It is never called. I made sure to set the delegate with videoDataOutput.setSampleBufferDelegate(self, queue: dataOutputQueue) and to implement the delegate methods. I used Xcode's auto-completion to create the method so I would not mistype its signature, as happened in this SO post.

Question:

How can I get this method to be called -- assuming my intuition is correct that it is called for each frame and provides a buffer I can append to my videoWriterInput -- so that I can record a video of the AVCaptureSession output I see on screen?

Noteworthy:

This project DOES NOT work in terms of calling

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)

while this DOES work.

EDIT:

I have found out that, for some reason, AVCaptureDataOutputSynchronizer prevents the delegate function from being called. Any ideas?


1 Answer

Alrighty, I have found the cause of my torment. I was using AVCaptureMetadataOutput, AVCaptureDepthDataOutput, AVCaptureAudioDataOutput, and AVCaptureVideoDataOutput, combined via AVCaptureDataOutputSynchronizer.

Since I am using AVCaptureDataOutputSynchronizer, all of the delegate calls are funneled through it. Instead of invoking the individual output delegates, it delivers the synchronized data to the AVCaptureDataOutputSynchronizer delegate method.

dataOutputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [videoDataOutput, depthDataOutput, metadataOutput])
dataOutputSynchronizer.setDelegate(self, queue: dataOutputQueue)

//... Some Code Later...

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {

    guard let syncedVideoData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else { return }
    guard !syncedVideoData.sampleBufferWasDropped else {
        print("dropped video:\(syncedVideoData)")
        return
    }
    let videoSampleBuffer = syncedVideoData.sampleBuffer
    print(videoSampleBuffer)

    let syncedDepthData = synchronizedDataCollection.synchronizedData(for: depthDataOutput) as? AVCaptureSynchronizedDepthData
    var depthData = syncedDepthData?.depthData
    if let syncedDepthData = syncedDepthData, syncedDepthData.depthDataWasDropped {
        print("dropped depth:\(syncedDepthData)")
        depthData = nil
    }

    // Find the threshold for the position where the face is
    let syncedMetaData = synchronizedDataCollection.synchronizedData(for: metadataOutput) as? AVCaptureSynchronizedMetadataObjectData
    var face: AVMetadataObject? = nil
    if let firstFace = syncedMetaData?.metadataObjects.first {
        face = videoDataOutput.transformedMetadataObject(for: firstFace, connection: videoConnection)
    }
    guard let imagePixelBuffer = CMSampleBufferGetImageBuffer(videoSampleBuffer) else { fatalError() }
    // Process imagePixelBuffer (e.g. apply CIFilters, append to a writer input) here
}

This gets called every frame as expected, and I can retrieve the individual sets of data (video, depth, metadata, etc.) and do what I please with them.
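To complete the original goal of recording, the usual AVAssetWriter pattern can be applied from this callback: start the session on the first buffer, then append. A rough sketch (here `writer` and `videoWriterInput` are assumed to be an AVAssetWriter and AVAssetWriterInput configured elsewhere, not part of the code above):

```swift
// Inside dataOutputSynchronizer(_:didOutput:), after unwrapping videoSampleBuffer.
// `writer` / `videoWriterInput` are assumed to be configured elsewhere.
if writer.status == .unknown {
    writer.startWriting()
    writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(videoSampleBuffer))
}
if writer.status == .writing, videoWriterInput.isReadyForMoreMediaData {
    videoWriterInput.append(videoSampleBuffer)
}
```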

  • Did you manage to record audio via `AVCaptureAudioDataOutput`? I see you mentioned it but there is no example in your provided source code. – mixtly87 Apr 02 '21 at 21:52
  • I had to record it in its own object with its own delegate functions. I can add the source code for that solution to this answer. It might take some time. – impression7vx Apr 02 '21 at 21:54
  • If you find the time, please provide some details in the example. I'm struggling with recording audio via the `dataOutputSynchronizer` callback. – mixtly87 Apr 02 '21 at 22:09
  • Yeah - I'd recommend not recording audio with the synchronizer. Put the audio on its own thread. – impression7vx Apr 02 '21 at 22:40
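Following up on the comments above, a rough sketch of keeping audio out of the synchronizer and giving it its own delegate and queue (the class and queue names here are illustrative, not from the original code):

```swift
import AVFoundation

// Sketch: audio capture in its own object with its own delegate queue,
// separate from AVCaptureDataOutputSynchronizer. Names are illustrative.
class AudioCapture: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    let audioDataOutput = AVCaptureAudioDataOutput()
    private let audioQueue = DispatchQueue(label: "com.Camera.audioQueue")

    func attach(to session: AVCaptureSession) {
        audioDataOutput.setSampleBufferDelegate(self, queue: audioQueue)
        if session.canAddOutput(audioDataOutput) {
            session.addOutput(audioDataOutput)
        }
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // Append sampleBuffer to an AVAssetWriterInput(mediaType: .audio, ...) here.
    }
}
```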