
I have seen many tutorials that show how to place the live camera image on the screen, but I want to modify the frames with a CIImage. I have taken a look at this, but couldn't tell whether that approach would work. I am assuming that to do this, I need to take every frame, modify it, and then put it on the screen. I would like something similar to this, but in Swift.
My questions are:

  • How to set up a function that is called whenever a new captured frame is available from the camera
  • How to modify the image efficiently. (I already know how to use CIImage, but it is very laggy)
– Bennett
  • To handle more camera functions, you should use AVFoundation to show the camera view. The capture session will return each captured frame through the delegate function `captureOutput(_:didOutputSampleBuffer:from:)`. There's already a nice sample from Apple: https://developer.apple.com/library/ios/qa/qa1702/_index.html – Tony Nguyen Aug 02 '16 at 03:39
  • @PhucNguyen I did try to convert that code a while ago, but it did not work. Found a good question [here](http://stackoverflow.com/questions/16475737/convert-uiimage-to-cmsamplebufferref#16487926) though. Going to check it out. – Bennett Aug 03 '16 at 01:02
  • The link in the first comment leads to a 404 page. – Manu Mateos Dec 07 '17 at 16:45
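The approach described in the comments can be sketched roughly as follows. This is a minimal sketch, not a complete app: `imageView` is a hypothetical output view, `CISepiaTone` is just an example filter, and error handling and camera-permission checks are omitted. It uses the modern Swift delegate signature `captureOutput(_:didOutput:from:)` (in older Swift versions this was `captureOutput(_:didOutputSampleBuffer:from:)`, as quoted above). One common cause of the lag mentioned in the question is creating a new `CIContext` for every frame; the sketch creates it once and reuses it.

```swift
import AVFoundation
import CoreImage
import UIKit

final class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let session = AVCaptureSession()
    // Create the CIContext once and reuse it — allocating one per frame is expensive.
    private let ciContext = CIContext()
    // Example filter; swap in whatever CIFilter you need.
    private let filter = CIFilter(name: "CISepiaTone")
    @IBOutlet weak var imageView: UIImageView!   // hypothetical view that displays the result

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // Deliver frames on a background queue so the main thread stays responsive.
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)

        filter?.setValue(ciImage, forKey: kCIInputImageKey)
        guard let filtered = filter?.outputImage,
              let cgImage = ciContext.createCGImage(filtered, from: filtered.extent) else { return }

        // UI updates must happen on the main thread.
        DispatchQueue.main.async {
            self.imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

For smoother rendering, a further refinement (not shown) is to draw the filtered `CIImage` directly into a Metal- or GL-backed view instead of converting each frame to a `UIImage`.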

0 Answers