
Is it possible to apply a filter to an AVLayer and add it to a view with addSublayer? I want to change the colors of, and add some noise to, the video from the camera using Swift, and I don't know how.

I thought it might be possible to add a filterLayer and a previewLayer like this:

self.view.layer.addSublayer(previewLayer)
self.view.layer.addSublayer(filterLayer)

and that this might produce video with my custom filter, but I suspect it could be done more effectively using AVComposition.

So here's what I need to know:

  1. What is the simplest way to apply a filter to the camera's video output in real time?
  2. Is it possible to merge an AVCaptureVideoPreviewLayer and a CALayer?

Thanks for any suggestions.

David Sýkora
  • You haven't provided much info about what you're doing, but it's possible to edit the video feed live using a `GLKView` instead of an `AVCaptureVideoPreviewLayer` and applying the filter to each frame in `captureOutput(_:didOutputSampleBuffer:fromConnection:)` (a rough sketch of this follows below). – Lyndsey Scott Sep 03 '15 at 14:48
  • Thanks! GLKView looks better to me. :) Simply put: I need to apply a filter to video frames live and have the option to save the result to a file. I'm building a kind of video camera with filters. – David Sýkora Sep 03 '15 at 15:08
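
A rough sketch of the GLKView approach from the comment above, in modern Swift. This is an illustration, not code from the thread; the FilterRenderer name and its wiring are assumptions:

import GLKit
import CoreImage

final class FilterRenderer {
    // one OpenGL ES context shared by the view and the Core Image context
    let glContext = EAGLContext(api: .openGLES2)!
    lazy var glView = GLKView(frame: .zero, context: glContext)
    lazy var ciContext = CIContext(eaglContext: glContext)

    // call this for every frame, e.g. from
    // captureOutput(_:didOutputSampleBuffer:fromConnection:)
    func draw(image: CIImage) {
        glView.bindDrawable()
        // drawing the full image extent assumes the drawable matches the
        // frame size; real code would aspect-fit the rects
        ciContext.draw(image, in: image.extent, from: image.extent)
        glView.display()
    }
}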

2 Answers


There's another alternative: use an AVCaptureSession to create instances of CIImage to which you can apply CIFilters (of which there are loads, from blurs to color correction to VFX).

Here's an example using the Comic Book effect. In a nutshell, create an AVCaptureSession:

let captureSession = AVCaptureSession()
captureSession.sessionPreset = AVCaptureSessionPresetPhoto

Create an AVCaptureDevice to represent the camera; here I'm using the default back camera:

let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

Then create an AVCaptureDeviceInput from that device and attach it to the session. In Swift 2, instantiating AVCaptureDeviceInput can throw an error, so we need to catch that:

do
{
    let input = try AVCaptureDeviceInput(device: backCamera)

    captureSession.addInput(input)
}
catch
{
    print("can't access camera")
    return
}

Now, here's a little 'gotcha': although we don't actually use the AVCaptureVideoPreviewLayer, it's required to get the sample buffer delegate working, so we create one of those:

// although we don't use this, it's required to get captureOutput invoked
let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

view.layer.addSublayer(previewLayer)

Next, we create a video output, AVCaptureVideoDataOutput, which we'll use to access the video feed:

let videoOutput = AVCaptureVideoDataOutput()

Ensuring that self implements AVCaptureVideoDataOutputSampleBufferDelegate, we can set the sample buffer delegate on the video output:

videoOutput.setSampleBufferDelegate(self,
    queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))

The video output is then attached to the capture session:

captureSession.addOutput(videoOutput)

...and, finally, we start the capture session:

captureSession.startRunning()

Because we've set the delegate, captureOutput will be invoked for each captured frame. It's passed a sample buffer of type CMSampleBuffer, and it takes just two lines of code to convert that data to a CIImage for Core Image to handle:

let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

...and that image data is passed to our Comic Book effect which, in turn, is used to populate an image view:

let comicEffect = CIFilter(name: "CIComicEffect")

comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)

dispatch_async(dispatch_get_main_queue())
{
    self.imageView.image = filteredImage
}
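
Putting those fragments together, here's a minimal sketch of the complete delegate method (Swift 2 signature; imageView is assumed to be an outlet on the view controller):

func captureOutput(captureOutput: AVCaptureOutput!,
    didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
    fromConnection connection: AVCaptureConnection!)
{
    // convert the frame's pixel buffer into a CIImage
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let cameraImage = CIImage(CVPixelBuffer: pixelBuffer!)

    // apply the Comic Book filter
    let comicEffect = CIFilter(name: "CIComicEffect")
    comicEffect!.setValue(cameraImage, forKey: kCIInputImageKey)

    let filteredImage = UIImage(CIImage: comicEffect!.valueForKey(kCIOutputImageKey) as! CIImage)

    // UIKit work must happen on the main queue
    dispatch_async(dispatch_get_main_queue())
    {
        self.imageView.image = filteredImage
    }
}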

I have the source code for this project available in my GitHub repo here.

Flex Monkey
  • Thanks a lot! Your articles and repo really helped me. But I got an error when I tried to create the dispatch_queue: `Cannot invoke 'setSampleBufferDelegate' with an argument list of type '(ViewController, queue: dispatch_queue_attr_t!)'` – David Sýkora Sep 04 '15 at 13:04
  • Thanks for the kind words! Take a look at this class: https://github.com/FlexMonkey/ParticleCam/blob/master/ParticleCam/ViewController.swift – Flex Monkey Sep 04 '15 at 13:31
  • Hah, my fault... I was using iOS 8.4.1, so problem solved. Again, thank you. – David Sýkora Sep 04 '15 at 13:47
  • One more thing: when I try it on an iPhone 4S the FPS drops to around 5-10, even though I have set `AVCaptureSessionPresetHigh`. Is this the best way to do this task on iOS 8? What about creating two layers and merging them into one? Would that be possible? Thanks a lot!! – David Sýkora Sep 04 '15 at 16:31
  • Hi @SimonGladman, thanks for the detailed answer. Could you explain the implications of exported videos having frame rates that differ from the source videos? Question: http://stackoverflow.com/questions/34937008/exporting-videos-on-ios-understanding-and-setting-frame-duration-property – Crashalot Jan 22 '16 at 00:44
  • Sorry - I don't know the answer to that :( – Flex Monkey Jan 22 '16 at 11:03
  • Hint: This won't work if you also want a file output. You would have to write the file yourself from the buffer. Correct me if I'm wrong. – dy_ Nov 04 '16 at 17:20
  • Can I save videos with these effects? – Kodr.F Dec 13 '16 at 12:09
  • @NinjaDevelopers you could, but you'd have to write the buffer output to a file; I think you do this with `AVAssetWriterInputPixelBufferAdaptor` (a sketch of that idea follows these comments), see http://stackoverflow.com/questions/3741323/how-do-i-export-uiimage-array-as-a-movie – Scriptable Dec 13 '16 at 16:58
  • @Scriptable thank you, I am new to Swift, I hope I can make it ;) – Kodr.F Dec 14 '16 at 07:45
  • well, converting the UIImage back to a CMSampleBuffer wouldn't be real time here – user924 Mar 02 '18 at 13:06
  • or does it change the buffer directly from the CVPixelBuffer? Is there any filter to add text using CIFilter? – user924 Mar 02 '18 at 13:07
  • Hi all, is it possible to apply a CIFilter directly to the video, not to its image frames? I'm facing an issue with the source video's audio (the sound is missing after applying the filter effect). Please advise! – Anand Gautam Mar 14 '18 at 13:30
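
A minimal sketch of the saving approach suggested in the comments above: render the filtered CIImage back into the frame's pixel buffer and hand it to an AVAssetWriterInputPixelBufferAdaptor. This is an illustration in modern Swift, not code from the thread; the FilteredWriter class and its names are assumptions:

import AVFoundation
import CoreImage

final class FilteredWriter {
    let writer: AVAssetWriter
    let writerInput: AVAssetWriterInput
    let adaptor: AVAssetWriterInputPixelBufferAdaptor
    let ciContext = CIContext()

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        writerInput.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: writerInput,
            sourcePixelBufferAttributes: nil)
        writer.add(writerInput)
        // remember to call writer.startWriting() and
        // writer.startSession(atSourceTime:) before appending the first frame
    }

    // render the filtered image back into the frame's pixel buffer and append it
    func append(filtered: CIImage, into pixelBuffer: CVPixelBuffer, at time: CMTime) {
        guard writerInput.isReadyForMoreMediaData else { return }
        ciContext.render(filtered, to: pixelBuffer)
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}

In captureOutput you'd take the timestamp from CMSampleBufferGetPresentationTimeStamp(sampleBuffer) and pass it in along with the filtered CIImage, then call writer.finishWriting(completionHandler:) when you're done.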

If you're using an AVPlayerViewController, you can set the compositingFilter property of the view's layer:

  playerController.view.layer.compositingFilter = "multiplyBlendMode"

See here for the compositing filter options you can use, e.g. "multiplyBlendMode", "screenBlendMode", etc.

Example of doing this in a UIViewController:

import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {
  override func viewDidLoad() {
    super.viewDidLoad()

    // load a movie called my_movie.mp4 that's in your Xcode project
    let path = Bundle.main.path(forResource: "my_movie", ofType: "mp4")
    let player = AVPlayer(url: URL(fileURLWithPath: path!))

    // make a movie player and set the filter
    let playerController = AVPlayerViewController()
    playerController.player = player
    playerController.view.layer.compositingFilter = "multiplyBlendMode"

    // add the player view controller to this view controller
    self.addChild(playerController)
    playerController.view.frame = view.bounds
    view.addSubview(playerController.view)
    playerController.didMove(toParent: self)

    // play the movie
    player.play()
  }
}

For let path = Bundle.main.path(forResource: "my_movie", ofType: "mp4"), make sure you add the .mp4 file to Build Phases > Copy Bundle Resources in your Xcode project, or check the 'add to target' boxes when you import the file.

spnkr