I'm starting to develop an iOS app, and this is my first SO post. I'm trying to implement a view that shows the live preview from the rear camera and processes the captured frames. The preview layer works perfectly and I can see the picture in my view; however, the captureOutput delegate function is never called.
I have searched online for similar issues and solutions for a while and tried tweaking different things, including the output, connection, and dispatch-queue settings, but nothing has worked. Can anyone help me out or share some insights and directions? Thanks a lot in advance!
Here is my code; I'm using Xcode 11 beta with iOS 10 as the build target.
import UIKit
import AVFoundation

class ThreeDScanningViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    @IBOutlet weak var imageView: UIImageView!

    var session: AVCaptureSession!
    var device: AVCaptureDevice!
    var output: AVCaptureVideoDataOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        //NotificationCenter.default.addObserver(self, selector: #selector(self.startedNotif), name: .AVCaptureSessionDidStartRunning, object: nil)
        _ = initCamera()
    }

    func initCamera() -> Bool {
        session = AVCaptureSession()
        session.sessionPreset = AVCaptureSession.Preset.medium

        // Pick the rear camera
        let devices = AVCaptureDevice.devices()
        for d in devices where d.position == .back {
            device = d
        }
        if device == nil {
            return false
        }

        do {
            // Set up the input
            let input = try AVCaptureDeviceInput(device: device)
            if session.canAddInput(input) {
                session.addInput(input)
            } else {
                return false
            }

            // Cap the capture rate at 15 fps
            try device.lockForConfiguration()
            device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 15)
            device.unlockForConfiguration()

            // Set up the preview layer
            previewLayer = AVCaptureVideoPreviewLayer(session: session)
            previewLayer.frame = imageView.bounds
            imageView.layer.addSublayer(previewLayer)

            // Set up the output
            output = AVCaptureVideoDataOutput()
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
            let queue = DispatchQueue(label: "myqueue")
            output.setSampleBufferDelegate(self, queue: queue)
            output.alwaysDiscardsLateVideoFrames = true
            if session.canAddOutput(output) {
                session.addOutput(output)
            } else {
                return false
            }

            // Force portrait orientation on the video connection
            for connection in output.connections where connection.isVideoOrientationSupported {
                connection.videoOrientation = .portrait
            }

            session.startRunning()
        } catch let error as NSError {
            print(error)
            return false
        }
        return true
    }
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print("captureOutput!\n")
        DispatchQueue.main.async {
            // Do stuff
        }
    }
}
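One thing I've started to wonder about while writing this up: the delegate method above uses the old Swift 3 spelling. If I'm reading the current AVFoundation API correctly, since Swift 4 the sample-buffer callback is named captureOutput(_:didOutput:from:), and a method with any other name would simply never be matched by the delegate protocol, so it would silently never fire. Here is a minimal sketch of the signature I believe the compiler expects; this is an assumption on my part, and I haven't verified yet that it fixes my issue:

// Swift 4+ spelling of the AVCaptureVideoDataOutputSampleBufferDelegate
// callback: an underscored first parameter labeled 'output' and
// 'didOutput' instead of 'didOutputSampleBuffer'.
// Assumption: this is the name the protocol actually matches.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    print("captureOutput!")
    DispatchQueue.main.async {
        // Do stuff with sampleBuffer here
    }
}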
Here are some links I've looked into; none of them turned out to be relevant to my issue: