I'm trying to record the captured frames to a video file while simultaneously performing image-processing tasks on those same frames.
I have a single AVCaptureSession to which I have added two separate outputs:
- AVCaptureVideoDataOutput
- AVCaptureMovieFileOutput
I conform to both AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureFileOutputRecordingDelegate.
I am using
captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection)
to capture and analyze the frames, and
fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection])
to handle the video recording.
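To make the setup concrete, here is a simplified sketch of how the session is configured (the class name, queue label, and method stubs are illustrative, not my exact code):

```swift
import AVFoundation

final class CaptureManager: NSObject,
    AVCaptureVideoDataOutputSampleBufferDelegate,
    AVCaptureFileOutputRecordingDelegate {

    private let session = AVCaptureSession()
    private let videoDataOutput = AVCaptureVideoDataOutput()
    private let movieFileOutput = AVCaptureMovieFileOutput()
    private let videoQueue = DispatchQueue(label: "video.data.queue")

    func setupSession() {
        session.beginConfiguration()

        // Add the camera input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Output 1: per-frame sample buffers for analysis.
        videoDataOutput.setSampleBufferDelegate(self, queue: videoQueue)
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
        }

        // Output 2: movie file recording.
        if session.canAddOutput(movieFileOutput) {
            session.addOutput(movieFileOutput)
        }

        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every captured frame -- this stops firing once both outputs are added.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // ... image processing on sampleBuffer ...
    }

    // Recording delegate callbacks -- these keep working.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didStartRecordingTo fileURL: URL,
                    from connections: [AVCaptureConnection]) {
        // ... recording started ...
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // ... recording finished ...
    }
}
```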
Each output works fine on its own, but as soon as both outputs are added to the session, only the video recording works and captureOutput is never called.
Any idea why this happens? Am I doing something wrong, or is there something I need to check when setting up and configuring the session?