
We are using AVCaptureVideoDataOutput as the photo-output instance to capture images, but the image quality is noticeably lower than with UIImagePickerController, which we used previously. We cannot use AVCapturePhotoOutput or AVCaptureStillImageOutput for capturing the output, because we need every single camera frame for live filtering. Is there a way to improve the quality of images taken with AVCaptureVideoDataOutput and make it equivalent to images taken with UIImagePickerController?
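One thing worth checking is which device format the session actually selects: a session preset can pick a lower-resolution format than the hardware supports. The sketch below (a hypothetical helper, not from the original post) picks the highest-resolution format the device offers and sets it as `activeFormat`; it assumes you already hold a configured `AVCaptureDevice`.

```swift
import AVFoundation

// Sketch: select the highest-resolution format the device offers,
// so the video data output receives the largest frames available.
// Hypothetical helper; names and error handling are illustrative.
func selectHighestResolutionFormat(for device: AVCaptureDevice) throws {
    let best = device.formats.max { a, b in
        let da = CMVideoFormatDescriptionGetDimensions(a.formatDescription)
        let db = CMVideoFormatDescriptionGetDimensions(b.formatDescription)
        return da.width * da.height < db.width * db.height
    }
    guard let format = best else { return }
    try device.lockForConfiguration()
    device.activeFormat = format   // overrides the preset's format choice
    device.unlockForConfiguration()
}
```

Note that setting `activeFormat` directly overrides `sessionPreset`, so this would replace the `.photo` preset approach rather than combine with it.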

Things we have tried:

  • Setting setSampleBufferDelegate(_:queue:) with a serial dispatch queue
  • Setting the videoSettings property to a custom output format, such as BGRA (see Code Snippet SP16)
  • Setting the minFrameDuration property to cap the maximum frame rate
  • Setting the alwaysDiscardsLateVideoFrames property to NO
  • Setting captureSession.sessionPreset = .photo (also tried .high; not much difference)
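For reference, the settings in the list above, combined into one session setup, look roughly like the sketch below. This is a minimal reconstruction under stated assumptions (back wide-angle camera, abbreviated error handling), not the original code from the post.

```swift
import AVFoundation

// Sketch: an AVCaptureSession configured with the settings tried above.
// Assumes the back built-in wide-angle camera; error handling abbreviated.
func makeSession(delegate: AVCaptureVideoDataOutputSampleBufferDelegate) -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.sessionPreset = .photo   // highest-resolution video frames

    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    let output = AVCaptureVideoDataOutput()
    // Custom output format (BGRA), as tried above.
    output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                kCVPixelFormatType_32BGRA]
    output.alwaysDiscardsLateVideoFrames = false
    // Serial queue for the sample-buffer delegate.
    output.setSampleBufferDelegate(delegate,
                                   queue: DispatchQueue(label: "video.frames"))
    guard session.canAddOutput(output) else { return nil }
    session.addOutput(output)
    return session
}
```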

As a workaround we tried increasing sharpness with both CIUnsharpMask and CISharpenLuminance, but the result was still noticeably worse than the image taken with UIImagePickerController.
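The sharpening workaround described above can be sketched as follows; the parameter values are illustrative assumptions, not the ones used in the original attempt.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch of the CIUnsharpMask workaround mentioned above.
// Radius and intensity values are illustrative, not from the original post.
func sharpen(_ image: CIImage) -> CIImage {
    let filter = CIFilter.unsharpMask()
    filter.inputImage = image
    filter.radius = 2.5      // size of the sharpening halo, in pixels
    filter.intensity = 0.7   // strength of the effect
    return filter.outputImage ?? image
}
```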
