Okay, so I've been using OpenCV and want to use it on iOS with Swift 2.0. I implemented it successfully and tested it with a few functions/examples, and it worked fine! But the application I have in mind is live-camera object detection, which I plan to do with a cascade classifier. The part I'm having trouble with is setting up the CvVideoCameraDelegate protocol with my ViewController. I'm trying to use this tutorial/example to set it up, but I can't work out how to do it in Swift. Can someone please advise the correct way to set it up?
1 Answer
The tutorial you are referencing shows how to do this in Objective-C. Doing it in Swift is a little trickier: Swift cannot conform to CvVideoCameraDelegate directly, because its processImage: callback takes a cv::Mat, which is a C++ type that Swift cannot see. You will need to set up an Objective-C class as the camera's delegate, and your Swift code will have to communicate with the CvVideoCamera object via this Objective-C class (and possibly other helper Objective-C classes). Please see this question/answer: Video processing with OpenCV in IOS Swift project
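To make that concrete, here is a minimal sketch of such a wrapper. The class and file names (`OpenCVWrapper`) are my own, and the header path for CvVideoCamera depends on your OpenCV version (2.4.x ships it as `opencv2/highgui/cap_ios.h`, 3.x as `opencv2/videoio/cap_ios.h`):

```objc
// OpenCVWrapper.h — plain Objective-C, safe to list in the Swift bridging header
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface OpenCVWrapper : NSObject
- (instancetype)initWithParentView:(UIImageView *)parentView;
- (void)startCamera;
- (void)stopCamera;
@end
```

```objc
// OpenCVWrapper.mm — Objective-C++, so it may include the C++ OpenCV headers
#import <opencv2/opencv.hpp>
#import <opencv2/videoio/cap_ios.h>   // on OpenCV 2.4.x use <opencv2/highgui/cap_ios.h>
#import "OpenCVWrapper.h"

// Conform to CvVideoCameraDelegate here, in the .mm file, so that
// cv::Mat never appears in a header that Swift has to import.
@interface OpenCVWrapper () <CvVideoCameraDelegate>
@property (nonatomic, strong) CvVideoCamera *videoCamera;
@end

@implementation OpenCVWrapper

- (instancetype)initWithParentView:(UIImageView *)parentView {
    if (self = [super init]) {
        _videoCamera = [[CvVideoCamera alloc] initWithParentView:parentView];
        _videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
        _videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset640x480;
        _videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
        _videoCamera.defaultFPS = 30;
        _videoCamera.delegate = self;   // the delegate lives in Objective-C++, not Swift
    }
    return self;
}

- (void)startCamera { [self.videoCamera start]; }
- (void)stopCamera  { [self.videoCamera stop]; }

#pragma mark - CvVideoCameraDelegate

// Called once per frame; `image` is a cv::Mat, which Swift cannot handle.
// Replace the placeholder below with your cascade-classifier detection.
- (void)processImage:(cv::Mat &)image {
    cv::Mat inverted;
    cv::bitwise_not(image, inverted);   // placeholder: invert the frame
    inverted.copyTo(image);             // write the result back so it is displayed
}

@end
```

Then add `OpenCVWrapper.h` to your bridging header. Your Swift ViewController only needs to create the wrapper with the UIImageView that should show the preview, keep a strong reference to it, and call `startCamera()` / `stopCamera()`; it never has to touch cv::Mat or the CvVideoCameraDelegate protocol itself.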