I'm working on a Swift-based macOS app where I need to capture video input, but not display it on screen. Rather than display the video, I want to send the buffered frame data for processing elsewhere, and eventually display it on an object in a SceneKit scene.
I have a CameraInput class that has a prepareCamera method:
fileprivate func prepareCamera() {
    self.videoSession = AVCaptureSession()
    self.videoSession.sessionPreset = AVCaptureSession.Preset.photo

    if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
        for device in devices {
            if device.hasMediaType(AVMediaType.video) {
                cameraDevice = device

                if cameraDevice != nil {
                    do {
                        let input = try AVCaptureDeviceInput(device: cameraDevice)
                        if videoSession.canAddInput(input) {
                            videoSession.addInput(input)
                        }
                    } catch {
                        print(error.localizedDescription)
                    }
                }
            }
        }

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self as AVCaptureVideoDataOutputSampleBufferDelegate, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))
        if videoSession.canAddOutput(videoOutput) {
            videoSession.addOutput(videoOutput)
        }
    }
}
And a startSession method that starts the AVCaptureSession:
fileprivate func startSession() {
    if let videoSession = videoSession {
        if !videoSession.isRunning {
            self.videoInputRunning = true
            videoSession.startRunning()
        }
    }
}
I also implement AVCaptureVideoDataOutputSampleBufferDelegate, where I intend to capture the CMSampleBuffer for later use:
extension CameraInput: AVCaptureVideoDataOutputSampleBufferDelegate {
    internal func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print(Date())
    }
}
However, the delegate method is never called. Is this a situation where I have to display the video output on screen in order for the delegate to fire?