
In the design of my app, I would like to change the frame around the live camera feed so that it is the exact color of the middle pixel of the live camera feed. For the purposes of my app, this is important and not just a trivial color scheme.

I have code that goes through the bytes of a UIImage (or even a UIView) and finds the pixel color at a particular coordinate. However, this does not appear to work on AVCaptureSession or AVCaptureVideoPreviewLayer.

The best workaround I have is a background queue that, every 0.1 seconds, draws a snapshot image offscreen and reads the pixel color from that. I know this isn't the best way to do it, and I was wondering if anyone could point me towards resources that would be useful.

user7299543

1 Answer


You can get the camera frames from AVFoundation and analyse each frame's image buffer to determine the colour.

Assuming you already have an AVCaptureVideoDataOutput set up, you can simply implement the AVCaptureVideoDataOutputSampleBufferDelegate method below to receive the camera frames.
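
If you don't have the data output set up yet, here is a minimal sketch. The name `captureSession` is a placeholder for your own configured AVCaptureSession, and `self` is assumed to conform to AVCaptureVideoDataOutputSampleBufferDelegate:

```swift
import AVFoundation

// Sketch: attach a video data output to an existing capture session.
// `captureSession` is assumed to be your already-configured AVCaptureSession.
let videoOutput = AVCaptureVideoDataOutput()

// Request BGRA frames so the pixel bytes are easy to interpret later.
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]

// Deliver frames on a background queue, not the main thread.
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))

if captureSession.canAddOutput(videoOutput) {
    captureSession.addOutput(videoOutput)
}
```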

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection)

Note that this delegate hands you each frame as a CMSampleBuffer, whose image data lives in an underlying CVPixelBuffer. You can either convert that buffer into a UIImage and reuse your existing pixel-reading code, or read the pixel bytes from the CVPixelBuffer directly to find the colour.
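
As a sketch of the second option, assuming the data output was configured for kCVPixelFormatType_32BGRA, you could read the centre pixel straight out of the buffer inside the delegate callback:

```swift
import AVFoundation
import UIKit

// Sketch: read the centre pixel's colour from a BGRA sample buffer.
// Assumes the AVCaptureVideoDataOutput was configured with kCVPixelFormatType_32BGRA.
func centerPixelColor(of sampleBuffer: CMSampleBuffer) -> UIColor? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // The base address is only valid while the buffer is locked.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Byte offset of the middle pixel: row * bytesPerRow + column * 4 (BGRA = 4 bytes).
    let offset = (height / 2) * bytesPerRow + (width / 2) * 4
    let pixel = base.advanced(by: offset).assumingMemoryBound(to: UInt8.self)

    // BGRA byte order: blue, green, red, alpha.
    let b = CGFloat(pixel[0]) / 255.0
    let g = CGFloat(pixel[1]) / 255.0
    let r = CGFloat(pixel[2]) / 255.0
    let a = CGFloat(pixel[3]) / 255.0
    return UIColor(red: r, green: g, blue: b, alpha: a)
}
```

Since the delegate fires on a background queue, remember to hop back to the main queue before applying the resulting colour to your frame view.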

Below are some SO answers which can help you.

Bluewings