Goal
I am trying to apply the object detection functionality of the Breakfast Finder sample code to my app. When I add my personal model to the Breakfast Finder sample code and run it, it detects my objects and presents labels just fine.
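For reference, this is roughly how I understand the sample builds its Vision request; the snippet and the resource name `ObjectDetector` are from memory, not copied verbatim from the sample:

```swift
import Vision
import CoreML

// Roughly the sample's request setup (from memory, not verbatim).
// Swapping in my own model just means compiling my .mlmodel into the
// app and pointing this at its compiled resource.
func makeDetectionRequest() throws -> VNCoreMLRequest {
    guard let modelURL = Bundle.main.url(forResource: "ObjectDetector",
                                         withExtension: "mlmodelc") else {
        fatalError("Model file missing from app bundle")
    }
    let visionModel = try VNCoreMLModel(for: MLModel(contentsOf: modelURL))
    return VNCoreMLRequest(model: visionModel) { request, _ in
        // Each result carries labels plus a bounding box for drawing.
        for case let observation as VNRecognizedObjectObservation in request.results ?? [] {
            print(observation.labels.first?.identifier ?? "unknown",
                  observation.boundingBox)
        }
    }
}
```

This part works fine in the original sample project; the problem only appears in my new project.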
Problem
When I add the sample code to a test app (a new Xcode project), I can't get the live camera feed. I just get the camera-permission pop-up and then a blank screen.
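For what it's worth, the pop-up is the standard camera-permission alert, and I tap Allow when it appears. A check like the one below (my own sketch, not anything from the sample) would show whether access was actually granted:

```swift
import AVFoundation

// My own sanity check (not part of the sample): log the camera
// permission state after the system alert has been answered.
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    print("camera access granted")
case .notDetermined:
    print("user has not been asked yet")
case .denied, .restricted:
    print("camera access denied or restricted")
@unknown default:
    print("unknown authorization state")
}
```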
What I did to reproduce the problem
- Copy over the `ViewController.swift` and `VisionObjectRecognitionViewController.swift` files
- Link the Preview View's referencing outlet to the `@IBOutlet` in `ViewController` (line 17); see the sketch after this list
- Add the mlmodel file (the one from the sample code, not my own mlmodel)
- Add `NSCameraUsageDescription` to the Info file (with a value)
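For the outlet step, this is a condensed sketch of what I understand the sample's `ViewController` does with that Preview View (not the exact sample source; camera input and video output setup omitted):

```swift
import UIKit
import AVFoundation

// Condensed from my reading of the sample (not verbatim). If the
// referencing outlet isn't connected, previewView is nil at runtime
// and no preview layer ever gets attached, leaving a blank screen.
class ViewController: UIViewController {
    @IBOutlet weak var previewView: UIView!   // the outlet on line 17

    let session = AVCaptureSession()
    var previewLayer: AVCaptureVideoPreviewLayer!

    func setupPreview() {
        // (camera input and video-data output configuration omitted)
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.videoGravity = .resizeAspectFill
        previewLayer.frame = previewView.layer.bounds
        previewView.layer.addSublayer(previewLayer)
        session.startRunning()
    }
}
```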
On another attempt, I tried simply copying all of the files (Swift, plist, mlmodel, etc.) over from the sample code and troubleshooting the connection issues, but I got the same problem.
Final Thoughts
Why does the Breakfast Finder sample code result in a blank screen after being added to a new Xcode project? I have never dealt with a live camera feed before, so I might have overlooked a simple problem. I am testing on an iPhone XR running iOS 15. You can find a link to the sample code here, or search for "Breakfast Finder".