
What I want to do is capture iPhone camera frames with AVCaptureSession in an Xcode/Swift 3 project.

What I did is instantiate AVCaptureSession, AVCaptureVideoDataOutput and related objects. It works: the didOutputSampleBuffer delegate is called for each frame. What I want to do now is perform a simple task on each frame: a threshold. It is very simple, I just have to iterate once over all the frame's pixels.

I have read some tutorials that explain how to convert the raw pointer into a UIImage and display the result in a UIImageView.

But this is very slow. I do not understand why, because my task does almost nothing: just a threshold and some image-conversion code.

Do you know if I made a mistake, or if there is a better way to do this?

Thanks

import AVFoundation
import UIKit

// Assumed layout of the PixelData struct used below: 4 bytes per pixel, alpha first.
struct PixelData {
    var a: UInt8
    var r: UInt8
    var g: UInt8
    var b: UInt8
}

// AVCaptureVideoDataOutputSampleBufferDelegate requires NSObject conformance.
class MyClass: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate
{
    var myimageview: UIImageView!

    // Swift 3 delegate signature; called once per captured frame.
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)
    {
        // Better configured once at session setup than on every frame:
        connection.videoOrientation = .portrait
        connection.isVideoMirrored = true

        let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

        // Lock the base address before touching the pixel data.
        CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let width = CVPixelBufferGetWidth( pixelBuffer )
        let height = CVPixelBufferGetHeight( pixelBuffer )
        let bytesPerRow = CVPixelBufferGetBytesPerRow( pixelBuffer )

        // Allocating a fresh array on every frame is costly; reusing a
        // preallocated buffer would already help.
        let black = PixelData(a: 255, r: 0, g: 0, b: 0)
        var pixelData = [PixelData](repeating: black, count: width * height)

        if let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)
        {
            let buf = baseAddress.assumingMemoryBound(to: UInt8.self)

            var cpt = 0

            for y in 0..<height
            {
                for x in 0..<width
                {
                    let idx = x + y * width

                    // Assumes a BGRA buffer: +0 = blue, +1 = green, +2 = red.
                    if buf[bytesPerRow * y + x * 4 + 2] > 150 && buf[bytesPerRow * y + x * 4 + 1] < 150 && buf[bytesPerRow * y + x * 4 + 0] < 150
                    {
                         pixelData[ idx ].r = 0
                         pixelData[ idx ].g = 255
                         pixelData[ idx ].b = 0
                         cpt = cpt + 1
                    }
                    else
                    {
                        pixelData[ idx ].r = 0
                        pixelData[ idx ].g = 0
                        pixelData[ idx ].b = 0
                    }
                }
            }

        }


        // NSData(bytes:length:) copies the array, so the provider owns its data.
        var data = pixelData
        let providerRef = CGDataProvider(
            data: NSData(bytes: &data, length: data.count * MemoryLayout<PixelData>.size)
        )

        let cgim = CGImage(
            width: width,
            height: height,
            bitsPerComponent: 8,
            bitsPerPixel: 32, 
            bytesPerRow: width * (MemoryLayout<PixelData>.size),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedFirst.rawValue),
            provider: providerRef!,
            decode: nil,
            shouldInterpolate: true,
            intent: .defaultIntent
        )
        let image = UIImage(cgImage: cgim!)

        // [weak self] avoids a crash if self is deallocated while a frame
        // is still in flight on the capture queue.
        DispatchQueue.main.async { [weak self] in
            self?.myimageview.image = image
        }

        CVPixelBufferUnlockBaseAddress( pixelBuffer, CVPixelBufferLockFlags(rawValue: 0) )

    }
}
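For reference, much of the per-frame cost above is not the threshold itself but the allocations: a new `[PixelData]` array, an `NSData` copy, and a new `CGImage` for every frame. A minimal sketch of a cheaper CPU path (an assumption on my part, not the asker's code: it presumes a BGRA pixel buffer and writes the result back into the locked buffer so it can be wrapped in a `CIImage` without any copy):

```swift
import CoreImage
import CoreVideo

// Hypothetical variant: threshold in place on the locked BGRA buffer,
// then wrap it in a CIImage (no per-frame array, NSData or CGImage).
func thresholdInPlace(_ pixelBuffer: CVPixelBuffer) -> CIImage {
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let buf = CVPixelBufferGetBaseAddress(pixelBuffer)!.assumingMemoryBound(to: UInt8.self)

    for y in 0..<height {
        let row = y * bytesPerRow
        for x in 0..<width {
            let p = row + x * 4                  // BGRA: +0 blue, +1 green, +2 red
            let isRed = buf[p + 2] > 150 && buf[p + 1] < 150 && buf[p] < 150
            buf[p]     = 0                       // blue
            buf[p + 1] = isRed ? 255 : 0         // green
            buf[p + 2] = 0                       // red
        }
    }
    return CIImage(cvPixelBuffer: pixelBuffer)
}
```

The returned `CIImage` could then be shown in a `GLKView`/`CIContext`-backed view instead of a `UIImageView`, which avoids the `CGImage`/`UIImage` round trip entirely.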
Bob5421
  • Consider using CoreImage, e.g. http://stackoverflow.com/questions/29692275/how-to-output-a-cifilter-to-a-camera-view. Iterating over tens of thousands of pixels, one by one on a single CPU thread, will be slow. You need to process a frame in just a few milliseconds, use the GPU. – CSmith Dec 19 '16 at 21:36
  • So you think I should write my own CIFilter? – Bob5421 Dec 20 '16 at 07:06
  • Yes, while it might seem daunting your filter will actually be simple, and performance will be awesome. Check the WWDC sessions from a couple years back on What's New with Core Image. – CSmith Dec 20 '16 at 13:57
  • Do you think that opencv2 for iOS will work with the GPU too? I am wondering if I should use opencv2 – Bob5421 Dec 20 '16 at 15:16
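Following the CIFilter suggestion in the comments, the custom kernel for this threshold really can be tiny. A sketch of what it might look like (my assumption, not code from the thread: the kernel source is CIKernel Language, `0.59` approximates the 150/255 threshold, and `CIColorKernel(string:)` / `apply(withExtent:arguments:)` are the Swift 3 spellings):

```swift
import CoreImage

// Hypothetical GPU threshold: red-dominant pixels become green,
// everything else black (mirrors the CPU condition in the question).
let thresholdKernel = CIColorKernel(string:
    "kernel vec4 threshold(__sample s) {" +
    "    bool hit = s.r > 0.59 && s.g < 0.59 && s.b < 0.59;" +
    "    return hit ? vec4(0.0, 1.0, 0.0, 1.0) : vec4(0.0, 0.0, 0.0, 1.0);" +
    "}"
)

func thresholded(_ input: CIImage) -> CIImage? {
    return thresholdKernel?.apply(withExtent: input.extent, arguments: [input])
}
```

Feeding `CIImage(cvPixelBuffer:)` from the delegate into `thresholded(_:)` would move the whole per-pixel loop onto the GPU.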

0 Answers