
I'm looking to make a native iOS application in Swift 3/4 that uses the live preview of the back-facing camera and lets users apply filters, like the built-in Camera app does. The idea is to create my own filters by adjusting hue/RGB/brightness levels, etc. Eventually I want to create a hue slider that lets users filter for specific colours in the live preview.

All of the answers I've come across for similar problems were posted more than two years ago, and I'm not sure they still describe the relevant, up-to-date solution I'm looking for.

I'm not looking to take a photo and then apply a filter afterwards. I'm looking for the same functionality as the native Camera app: the filter is applied live, while you watch the camera preview.

How can I create this functionality? Can it be achieved using AVFoundation or AVKit? Or perhaps with ARKit?

nopassport1
  • In AVFoundation, have you tried intercepting the output buffer using an AVCaptureVideoDataOutputSampleBufferDelegate and then using the CIImage classes? I haven't attempted this myself, but that is where I would start. This may help: https://stackoverflow.com/questions/32378666/how-to-apply-filter-to-video-real-time-using-swift – adamfowlerphoto Oct 31 '17 at 09:18
  • @Spads Thanks for the comment. As a matter of fact, I was just in the process of following that same example. I've created an application using AVFoundation to display the back-facing camera output and will look at applying the filters next, but I'm unsure about the efficiency – nopassport1 Oct 31 '17 at 16:14
  • @Boris How did you eventually do this? What was your approach? – iOS_Passion Sep 15 '18 at 16:53
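
For reference, here is a minimal, untested sketch of the approach adamfowlerphoto suggests in the comment above: an AVCaptureSession feeding an AVCaptureVideoDataOutput, whose sample-buffer delegate wraps each frame in a CIImage ready for filtering. The class and queue names are placeholders, and you still need an NSCameraUsageDescription entry in Info.plist plus a way to display the filtered frames (see the answer below).

```swift
import AVFoundation
import CoreImage

final class CameraFeedController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let frameQueue = DispatchQueue(label: "camera.frame.queue") // arbitrary label

    func configureAndStart() {
        session.beginConfiguration()

        // Back-facing wide-angle camera as the input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else {
            session.commitConfiguration()
            return
        }
        session.addInput(input)

        // The video data output delivers one CMSampleBuffer per frame to the delegate.
        videoOutput.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(videoOutput) {
            session.addOutput(videoOutput)
        }

        session.commitConfiguration()
        session.startRunning()
    }

    // Called on `frameQueue` for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)
        // Apply your CIFilter(s) to `cameraImage` here, then hand the result
        // to whatever view or renderer displays it.
    }
}
```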

1 Answer


Yes, you can apply image filters to the camera feed by capturing video with the AVFoundation Capture system and using your own renderer to process and display video frames.

Apple has a sample code project called AVCamPhotoFilter that does just this, and shows multiple approaches to the process, using Metal or Core Image. The key points are to:

  1. Use AVCaptureVideoDataOutput to get live video frames.
  2. Use CVMetalTextureCache or CVPixelBufferPool to make the video pixel buffers accessible to your favorite rendering technology.
  3. Draw the textures using Metal (or OpenGL, or whatever) with a Metal shader or Core Image filter to do the pixel processing on the GPU during your render pass (a rough Core Image version is sketched below).
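
For the Core Image route, a rough sketch of how steps 2–3 could fit together follows. This is not AVCamPhotoFilter's actual code; FilterRenderer, hueAngle, and the choice of CIHueAdjust are just illustrative assumptions, and the CIImage is assumed to come from the capture delegate in step 1.

```swift
import CoreImage
import Metal
import MetalKit

/// Applies a hue rotation to each camera frame and draws the result into an MTKView.
final class FilterRenderer {

    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!
    private lazy var ciContext = CIContext(mtlDevice: device)
    private let hueFilter = CIFilter(name: "CIHueAdjust")!

    /// Hue rotation in radians; bind this to a UISlider for the hue-slider idea.
    var hueAngle: Float = 0

    /// Call with the CIImage produced in the capture delegate. The MTKView needs
    /// `framebufferOnly = false` so Core Image can write into its drawable texture.
    func render(_ cameraImage: CIImage, into view: MTKView) {
        hueFilter.setValue(cameraImage, forKey: kCIInputImageKey)
        hueFilter.setValue(hueAngle, forKey: kCIInputAngleKey)

        guard let filtered = hueFilter.outputImage,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        // Core Image encodes its work into the Metal command buffer,
        // so the per-pixel processing runs on the GPU.
        ciContext.render(filtered,
                         to: drawable.texture,
                         commandBuffer: commandBuffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```

In a setup like this you would typically set the MTKView's isPaused = true and enableSetNeedsDisplay = false and drive drawing yourself from the capture callback; AVCamPhotoFilter shows a more complete (and more carefully optimized) version of the same idea.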

BTW, ARKit is overkill if all you want to do is apply image processing to the camera feed. ARKit is for when you want to know about the camera’s relationship to real-world space, primarily for purposes like drawing 3D content that appears to inhabit the real world.

rickster
  • Do you have any example that does exactly that? – iOS_Passion Sep 16 '18 at 14:06
  • Yes — the Apple sample code AVCamPhotoFilter that’s linked in the answer. – rickster Sep 16 '18 at 15:02
  • Thank you. I will take a look at AVCamPhotoFilter. Actually I want to achieve something like this: https://stackoverflow.com/questions/52231662/change-color-of-real-time-object-in-live-camera-feed-in-iphone. I would really appreciate any suggestions you have. – iOS_Passion Sep 16 '18 at 15:23