
I am trying to blur the AVCaptureVideoPreviewLayer on applicationWillResignActive just as in the stock iOS Camera App.

Since rasterizing AVCaptureVideoPreviewLayer and blurring it directly produces an empty frame, my approach is to keep a CIImage from didOutputSampleBuffer. On applicationWillResignActive I apply a CIGaussianBlur filter to that CIImage, add a UIImageView on top of the AVCaptureVideoPreviewLayer, and set the UIImageView's image to the blurred UIImage.

This seems to be working okay so far. However, it produces quite a bit of banding, causing visible quality issues in the Gaussian blur.

[Screenshot: the blurred preview, showing visible banding]

Grabbing the Frame:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Lock the pixel buffer while reading its base address
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmap = CGBitmapInfo(rawValue: CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
    let context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                        bytesPerRow, colorSpace, bitmap.rawValue)
    let quartzImage = CGBitmapContextCreateImage(context!)

    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))

    self.cameraBufferImage = CIImage(CGImage: quartzImage!)
}
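As an aside, if the intermediate CGImage isn't needed for anything else, the same CIImage can be created straight from the pixel buffer, skipping the lock/bitmap-context round trip entirely. A sketch in the same Swift 2-era API as above:

```swift
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // CIImage can wrap a CVPixelBuffer directly; Core Image defers any
    // copying and conversion until the image is actually rendered
    self.cameraBufferImage = CIImage(CVPixelBuffer: imageBuffer)
}
```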

Applying the CIGaussianBlur:

func blurPreviewWindow() {
    // Freeze the live preview while the blur overlay is shown
    let previewLayer = self.previewView.layer as? AVCaptureVideoPreviewLayer
    previewLayer?.connection.enabled = false

    let context = CIContext(options: nil)
    let inputImage = self.cameraBufferImage!.imageByApplyingOrientation(6)

    // Clamp extends the edge pixels outward so the blur doesn't fade to
    // transparency at the image borders
    let clampFilter = CIFilter(name: "CIAffineClamp")!
    clampFilter.setDefaults()
    clampFilter.setValue(inputImage, forKey: "inputImage")

    if let currentFilter = CIFilter(name: "CIGaussianBlur") {
        currentFilter.setValue(clampFilter.outputImage, forKey: "inputImage")
        currentFilter.setValue(50.0, forKey: "inputRadius")

        if let output = currentFilter.outputImage {
            // Crop back to the original extent; the clamp made the image infinite
            if let cgimg = context.createCGImage(output, fromRect: inputImage.extent) {
                let processedImage = UIImage(CGImage: cgimg)
                self.previewBlurImageView = UIImageView(frame: self.previewView.bounds)
                self.previewBlurImageView?.alpha = 0
                self.previewBlurImageView?.image = processedImage
                self.previewBlurImageView?.contentMode = .ScaleToFill
                self.previewView.addSubview(self.previewBlurImageView!)

                // Fade the blurred still in over the frozen preview
                UIView.animateWithDuration(0.2, delay: 0.0, options: [.BeginFromCurrentState], animations: {
                    self.previewBlurImageView?.alpha = 1
                }, completion: nil)
            }
        }
    }
}

Maybe there's a whole different approach to this in the iOS 10 era?

UPDATE:

Could this be a color space issue, since the test device is an iPhone 7, which has wide color?
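For what it's worth, since blurring the full sensor-resolution frame is expensive, one option (an untested sketch, same Swift 2-era API; `toWidth` is a hypothetical parameter, e.g. the preview view's width in pixels) is to scale the CIImage down to roughly screen size before blurring:

```swift
// Untested sketch: downscale the frame before blurring so CIGaussianBlur
// processes far fewer pixels
func downscaled(image: CIImage, toWidth targetWidth: CGFloat) -> CIImage {
    let scale = targetWidth / image.extent.width
    guard scale < 1 else { return image }
    return image.imageByApplyingTransform(CGAffineTransformMakeScale(scale, scale))
}
```

Because CIGaussianBlur's inputRadius is in pixels, the radius would need to be multiplied by the same scale factor to keep the look consistent.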

  • Try using a different blur-related CIFilter ("CIBoxBlur", "CIDiscBlur", "CIGaussianBlur", "CIMaskedVariableBlur", "CIMedianFilter", "CIMotionBlur", "CIZoomBlur"); you don't need to change any of your code, just the filter name. See if that makes any difference. – Joe Oct 07 '16 at 14:18
  • It seems that other filters do the same, though some less than others. The most important part is, only CIGaussianBlur gives the look I am looking for. Maybe it's the way I am extracting the CIImage? Maybe there's a completely different approach to blurring AVCaptureVideoPreviewLayer? – Gizmodo Oct 07 '16 at 22:15
  • If you want, you can create your own custom filter; check out the following links: https://github.com/FlexMonkey/Filterpedia , http://flexmonkey.blogspot.com.au/2016/04/creating-custom-variable-blur-filter-in.html?m=1 , https://developer.apple.com/library/content/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_custom_filters/ci_custom_filters.html – hope this helps. – Joe Oct 08 '16 at 00:36
  • Since the above method is somewhat expensive, there HAS TO BE a better way to blur out the live camera preview. – Gizmodo Oct 08 '16 at 00:41
  • Did you check out the link? – Joe Oct 08 '16 at 00:44
  • I am currently. The above method grabs a full 12 MP image (or whatever the camera sensor produces) and tries to apply the filter. If the camera preview layer is the size of the screen, this could be done in a better way, so that just a rasterized camera preview is blurred. I just can't figure out that part. – Gizmodo Oct 08 '16 at 00:48
  • Check out the link; you have to build your own color kernel to achieve a new custom filter. It's a complex process, but spending a bit of time will do the job. – Joe Oct 08 '16 at 00:50
  • Thank you. I will. There are many camera apps that are doing this easily. – Gizmodo Oct 08 '16 at 00:55
  • http://stackoverflow.com/questions/32378666/how-to-apply-filter-to-video-real-time-using-swift may solve your problem. I strongly recommend you go to FlexMonkey and study creating custom filters; a unique filter stands out more than the stock filters. Always keep that in mind. – Joe Oct 08 '16 at 01:08
  • Still dealing with this issue. Could this be a color profile issue? Since this banding is happening on iPhone 7. – Gizmodo Oct 15 '16 at 17:45
  • Did you try changing inputRadius? – Joe Oct 15 '16 at 17:49
  • I did. Banding is more apparent when the image is darker.. – Gizmodo Oct 15 '16 at 17:50
  • Try autoEnhance on the inputImage before you apply your filter. – Joe Oct 15 '16 at 17:52
  • To the CIImage before applying the layers? Could you elaborate on that? – Gizmodo Oct 15 '16 at 17:57
  • Try the following link: http://www.hangge.com/blog/cache/detail_902.html . I don't think it's a good idea to use CIAffineClamp as an input filter (it only clamps the pixels at the edge of the transformed image, extending them outwards). So it's better to auto-enhance the inputImage and pass it to the blur filter, or you need to study filters from CICategoryColorAdjustment and CICategoryColorEffect. I want you to consider filters related to image hue, alpha, saturation, and exposure. – Joe Oct 15 '16 at 18:13
  • I'm wondering whether the screenshot you posted is actually from the simulator or an iPhone. – Joe Oct 15 '16 at 18:17
  • iPhone. Removing the clamp filter still produces banding. – Gizmodo Oct 15 '16 at 19:56

1 Answer


It was CIContext.

Changed:

let context = CIContext(options: nil)

To:

let context = CIContext(options: [kCIContextWorkingColorSpace: CGColorSpaceCreateDeviceRGB(),
                                  kCIContextOutputColorSpace: CGColorSpaceCreateDeviceRGB(), 
                                  kCIContextUseSoftwareRenderer: false])

No more banding or artifacts!!!
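One further note (general Core Image guidance, not specific to this fix): creating a CIContext is comparatively expensive, so it is usually better to build it once, for example as a property, and reuse it for every render rather than constructing a new one inside blurPreviewWindow():

```swift
// Create once and reuse across renders; each CIContext carries its own
// caches and (on the GPU path) Metal/GL state
let blurContext = CIContext(options: [kCIContextWorkingColorSpace: CGColorSpaceCreateDeviceRGB(),
                                      kCIContextOutputColorSpace: CGColorSpaceCreateDeviceRGB(),
                                      kCIContextUseSoftwareRenderer: false])
```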

  • I'm glad that works for you, but it might still be doing more work than you need. If you just want a "UI-style" blur, have you tried putting your capture preview layer inside a `UIVisualEffectView`? – rickster Oct 27 '16 at 21:48
  • I had this all this time. UIVisualEffectView looks a little muddy sometimes, compared to Gaussian Blur. – Gizmodo Oct 27 '16 at 21:49
  • out of curiosity, do you know what the default working and output colour spaces were originally? – Rhythmic Fistman Nov 03 '16 at 02:25
  • I don't. Toggling kCIContextUseSoftwareRenderer had no effect on the blur. – Gizmodo Nov 03 '16 at 03:10
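For completeness, rickster's UIVisualEffectView suggestion would look roughly like this (a sketch in the same Swift 2-era API; as noted above, the OP found the system material muddier-looking than a plain CIGaussianBlur):

```swift
// Overlay a system blur instead of rendering a blurred still.
// Much cheaper, but the material look differs from a Gaussian blur.
let effectView = UIVisualEffectView(effect: UIBlurEffect(style: .Light))
effectView.frame = previewView.bounds
previewView.addSubview(effectView)
```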