I am using AVFoundation's captureOutput(_:didOutputSampleBuffer:from:) to extract an image, which is then used as the input for a filter.
self.bufferFrameQueue = DispatchQueue(label: "bufferFrame queue", qos: DispatchQoS.background, attributes: [], autoreleaseFrequency: .inherit)

self.videoDataOutput = AVCaptureVideoDataOutput()
if self.session.canAddOutput(self.videoDataOutput) {
    self.session.addOutput(videoDataOutput)
    self.videoDataOutput!.alwaysDiscardsLateVideoFrames = true
    self.videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    self.videoDataOutput!.setSampleBufferDelegate(self, queue: self.bufferFrameQueue)
}
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    connection.videoOrientation = .portrait

    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

    DispatchQueue.main.async {
        self.cameraBufferImage = ciImage
    }
}
The above just updates self.cameraBufferImage whenever there is a new output sample buffer.
Then, when a filter button is pressed, I use self.cameraBufferImage like this:
func filterButtonPressed() {

    if let inputImage = self.cameraBufferImage {
        if let currentFilter = CIFilter(name: "CISepiaTone") {
            currentFilter.setValue(inputImage, forKey: "inputImage")
            currentFilter.setValue(1, forKey: "inputIntensity")
            if let output = currentFilter.outputImage {
                if let cgimg = self.context.createCGImage(output, from: inputImage.extent) {
                    self.filterImageLayer = CALayer()
                    self.filterImageLayer!.frame = self.imagePreviewView.bounds
                    self.filterImageLayer!.contents = cgimg
                    self.filterImageLayer!.contentsGravity = kCAGravityResizeAspectFill
                    self.imagePreviewView.layer.addSublayer(self.filterImageLayer!)
                }
            }
        }
    }
}
When the above method is invoked, it grabs the 'current' self.cameraBufferImage and uses it to apply the filter. This works fine at normal exposure durations (below 1/15 second or so).
Issue
When the exposure duration is slow, e.g. 1/3 second, it takes a while (about 1/3 second) to apply the filter. This delay is only present the first time after launch. If done again, there is no delay at all.
Thoughts
I understand that if the exposure duration is 1/3 second, didOutputSampleBuffer only fires every 1/3 second. However, why is there that initial delay? Shouldn't it just grab whatever self.cameraBufferImage is available at that exact moment, instead of waiting?
- Queue issue? (See the sketch after this list.)
- CMSampleBuffer retain issue? (Although in Swift 3 there is no explicit CFRetain; see the copy sketch under Update below.)
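To make the first bullet concrete, this is the kind of rearrangement I have in mind (an untested sketch; latestFrame and currentFrame() are names I made up): keep the latest frame behind the same serial delegate queue and read it synchronously from the button handler, so there is no async hop to the main queue between the write and the read.

// Sketch only: assumes bufferFrameQueue is the serial queue passed to
// setSampleBufferDelegate(_:queue:), and latestFrame is a new property.
private var latestFrame: CIImage?   // only touched on bufferFrameQueue

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    connection.videoOrientation = .portrait
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // Already running on bufferFrameQueue, so no dispatch is needed here.
    latestFrame = CIImage(cvPixelBuffer: pixelBuffer)
}

// Called from the main thread (e.g. filterButtonPressed); blocks only until
// any in-flight delegate callback finishes.
func currentFrame() -> CIImage? {
    var image: CIImage?
    bufferFrameQueue.sync {
        image = self.latestFrame
    }
    return image
}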
Update

From Apple's documentation for captureOutput(_:didOutputSampleBuffer:from:):
Delegates receive this message whenever the output captures and outputs a new video frame, decoding or re-encoding it as specified by its videoSettings property. Delegates can use the provided video frame in conjunction with other APIs for further processing.
This method is called on the dispatch queue specified by the output’s sampleBufferCallbackQueue property. It is called periodically, so it must be efficient to prevent capture performance problems, including dropped frames.
If you need to reference the CMSampleBuffer object outside of the scope of this method, you must CFRetain it and then CFRelease it when you are finished with it.
To maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.
If your application is causing samples to be dropped by retaining the provided CMSampleBuffer objects for too long, but it needs access to the sample data for a long period of time, consider copying the data into a new buffer and then releasing the sample buffer (if it was previously retained) so that the memory it references can be reused.
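For reference, this is roughly what the last paragraph's "copying the data into a new buffer" would look like on my side (an untested sketch; copyPixelBuffer(_:) is a helper I made up, not an API): the pixel data is deep-copied out of the capture pool before the CIImage is created, so the delivered sample buffer can be recycled immediately.

import CoreVideo
import Foundation

// Sketch only: deep-copies a CVPixelBuffer so the original, pool-backed
// buffer handed to the delegate can be released and reused right away.
func copyPixelBuffer(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    CVPixelBufferLockBaseAddress(source, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(source, .readOnly) }

    var copyOut: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     CVPixelBufferGetWidth(source),
                                     CVPixelBufferGetHeight(source),
                                     CVPixelBufferGetPixelFormatType(source),
                                     nil,
                                     &copyOut)
    guard status == kCVReturnSuccess, let copy = copyOut else { return nil }

    CVPixelBufferLockBaseAddress(copy, [])
    defer { CVPixelBufferUnlockBaseAddress(copy, []) }

    guard let srcBase = CVPixelBufferGetBaseAddress(source),
          let dstBase = CVPixelBufferGetBaseAddress(copy) else { return nil }

    // Copy row by row in case the two buffers use different bytes-per-row.
    let srcBytesPerRow = CVPixelBufferGetBytesPerRow(source)
    let dstBytesPerRow = CVPixelBufferGetBytesPerRow(copy)
    let rowLength = min(srcBytesPerRow, dstBytesPerRow)
    for row in 0..<CVPixelBufferGetHeight(source) {
        memcpy(dstBase + row * dstBytesPerRow, srcBase + row * srcBytesPerRow, rowLength)
    }
    return copy
}

// In the delegate, the CIImage would then wrap the copy instead of the
// pool-backed buffer:
// let ciImage = CIImage(cvPixelBuffer: copyPixelBuffer(pixelBuffer) ?? pixelBuffer)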