In the design of my app, I would like to change the frame around the live camera feed so that it is the exact color of the center pixel of the live feed. For the purposes of my app, this is important and not just a trivial color scheme choice.
I have code that goes through the bytes of a UIImage (or even a UIView) and finds the pixel color at a particular coordinate. However, this approach does not appear to work on an AVCaptureSession or an AVCaptureVideoPreviewLayer.
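For reference, the kind of byte-level lookup I mean is roughly this (a sketch, not my exact code; it assumes a common 32-bit RGBA layout, and real images may use other byte orders, so `cgImage.bitmapInfo` would need checking in practice):

```swift
import UIKit

extension UIImage {
    /// Reads the raw bytes backing the image's CGImage and indexes the
    /// pixel at the given (pixel-space, top-left origin) coordinate.
    /// Assumes 8 bits per component, RGBA order.
    func pixelColor(at point: CGPoint) -> UIColor? {
        guard let cgImage = cgImage,
              let data = cgImage.dataProvider?.data,
              let bytes = CFDataGetBytePtr(data) else { return nil }
        let bytesPerPixel = cgImage.bitsPerPixel / 8
        let offset = Int(point.y) * cgImage.bytesPerRow
                   + Int(point.x) * bytesPerPixel
        return UIColor(red:   CGFloat(bytes[offset])     / 255,
                       green: CGFloat(bytes[offset + 1]) / 255,
                       blue:  CGFloat(bytes[offset + 2]) / 255,
                       alpha: CGFloat(bytes[offset + 3]) / 255)
    }
}
```

(Note that `UIImage.size` is in points, not pixels, so the center coordinate has to be computed against `cgImage.width`/`cgImage.height` on scaled images.)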
The best workaround I have is running a queue in the background that, every 0.1 seconds, essentially draws a saved image offscreen and finds the pixel color that way. I know this isn't the best way to do it, and I was wondering if anyone could point me toward resources that would be useful.
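My current workaround looks roughly like the sketch below. `latestFrame` is a hypothetical hook standing in for however the most recent frame is captured as a UIImage; I use a `Timer` here rather than a background queue only because the border update has to land on the main thread anyway:

```swift
import UIKit

final class FrameColorPoller {
    private var timer: Timer?
    /// Hypothetical hook: returns the most recently captured frame.
    var latestFrame: () -> UIImage? = { nil }

    /// Every 0.1 s, sample the frame's center pixel and tint the border.
    func start(updating frameView: UIView) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] _ in
            guard let image = self?.latestFrame(),
                  let color = self?.centerPixelColor(of: image) else { return }
            frameView.layer.borderColor = color.cgColor
        }
    }

    func stop() { timer?.invalidate() }

    /// Draws the image into a 1x1 RGBA bitmap, shifted so the image's
    /// center pixel lands in that single bitmap pixel.
    private func centerPixelColor(of image: UIImage) -> UIColor? {
        guard let cgImage = image.cgImage else { return nil }
        var pixel = [UInt8](repeating: 0, count: 4)
        guard let context = CGContext(
            data: &pixel, width: 1, height: 1,
            bitsPerComponent: 8, bytesPerRow: 4,
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
        ) else { return nil }
        context.translateBy(x: -CGFloat(cgImage.width) / 2,
                            y: -CGFloat(cgImage.height) / 2)
        context.draw(cgImage, in: CGRect(x: 0, y: 0,
                                         width: CGFloat(cgImage.width),
                                         height: CGFloat(cgImage.height)))
        return UIColor(red:   CGFloat(pixel[0]) / 255,
                       green: CGFloat(pixel[1]) / 255,
                       blue:  CGFloat(pixel[2]) / 255,
                       alpha: 1)
    }
}
```

The redraw-the-whole-image-per-tick part is what feels wasteful, which is why I suspect there is a better way to get at the live pixels directly.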