18

I'm working with AVCaptureSession to capture an image. It's working fine but not giving good resolution. I compared its output with an image captured by the iPhone Camera app, and the Camera app's image is much better than the AVCaptureSession image.

I have seen 3-4 links on Stack Overflow about this but could not find any solution.

I have also tried all of the presets:

AVCaptureSessionPresetPhoto, 
AVCaptureSessionPresetHigh, 
AVCaptureSessionPresetMedium, 
AVCaptureSessionPresetLow, 
AVCaptureSessionPreset352x288, 
AVCaptureSessionPreset640x480, 
AVCaptureSessionPreset1280x720, 
AVCaptureSessionPreset1920x1080, 
AVCaptureSessionPresetiFrame960x540, 
AVCaptureSessionPresetiFrame1280x720, 

but the still image is still of lower quality. Please let me know if my question is not clear enough or if I missed something.
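In case it helps, here is a simplified sketch of roughly how I'm setting up the session and still image output (placeholder names, not my exact code):

#import <AVFoundation/AVFoundation.h>

// Simplified sketch: session configured for full-resolution photo capture
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

NSError *error = nil;
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
[session addOutput:stillImageOutput];

[session startRunning];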

TheTiger

3 Answers

14

When using the preset AVCaptureSessionPresetPhoto with an AVCaptureStillImageOutput, I'm able to capture images on an iPhone 4S at a resolution of 3264x2448, which is the exact same resolution that the built-in camera application yields. The same is true for the iPhone 4, Retina iPad, etc., so if you use that preset with a still image output, you will get a sample buffer back from -captureStillImageAsynchronouslyFromConnection:completionHandler: that is at the native camera resolution.
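As a rough sketch (not the exact code from my example app; the variable names are placeholders, and stillImageOutput is assumed to be the output added to a session configured with AVCaptureSessionPresetPhoto), the capture path looks something like this:

// Grab the video connection from the still image output and capture a still
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                               completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    if (imageSampleBuffer == NULL) {
        return;
    }
    // With AVCaptureSessionPresetPhoto, this buffer comes back at the native camera resolution
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *photo = [UIImage imageWithData:jpegData];
    NSLog(@"Captured image size: %@", NSStringFromCGSize(photo.size));
}];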

In regard to photo quality, remember that the built-in camera application can capture high-dynamic-range (HDR) photos by quickly acquiring images at different exposure levels. We do not have access to this via the standard AV Foundation APIs, so all we get is one image at a defined exposure level.

If you turn HDR off, the image quality looks identical to me. Here is a zoomed-in portion of a photo captured using an AVCaptureStillImageOutput:

AVCaptureStillImageOutput image

and here is one from the built-in photo application:

Built-in Photos image

Ignoring the slight differences in lighting due to a little shift in camera direction, the resolution and fidelity of images captured both ways appear to be the same.

I captured the first image using the SimplePhotoFilter example application from my open source GPUImage framework, replacing the default GPUImageSketchFilter with a GPUImageGammaFilter that didn't adjust the gamma at all and just acted as a passthrough.
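The relevant GPUImage setup, in simplified sketch form based on the framework's public API (see the SimplePhotoFilter example for the actual code), looks something like this:

// Sketch: GPUImageStillCamera feeding a passthrough gamma filter
GPUImageStillCamera *stillCamera =
    [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                         cameraPosition:AVCaptureDevicePositionBack];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageGammaFilter *gammaFilter = [[GPUImageGammaFilter alloc] init];
gammaFilter.gamma = 1.0; // no adjustment, effectively a passthrough

[stillCamera addTarget:gammaFilter];
[stillCamera startCameraCapture];

// Later, capture a full-resolution photo processed through the filter chain
[stillCamera capturePhotoAsJPEGProcessedUpToFilter:gammaFilter
                              withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
    if (processedJPEG != nil) {
        // Save or display the JPEG data here
    }
}];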

Brad Larson
  • Thank you for your response!! I downloaded your GPUImage framework a month ago, used it in my test application, and found it great. My current app supports all orientations and a flash on/off button. GPUImage is a big project in itself and I just want to use a part of it, but I'm unable to do that, and my camera view is 359x280 in landscape, so GPUImage is hard for me to use. :-( – TheTiger Jun 11 '12 at 05:30
  • Is it possible to set any preset other than the ones listed above? – TheTiger Jun 11 '12 at 05:36
  • 1
    @VakulSaini - Simply look into the GPUImageStillCamera and GPUImageVideoCamera classes, where you can see how I set up the AV Foundation inputs and capture photos. This code works, so base your own implementation on this. I'm not sure why you're asking about other presets, because if you're taking a photo you'll want to use AVCaptureSessionPresetPhoto for the highest resolution capture. Anything else will be lower resolution than the native camera. – Brad Larson Jun 11 '12 at 14:33
  • Hi @BradLarson do you know how to stabilize early frames for videos as described in this question: http://stackoverflow.com/questions/34912050/avoiding-blurriness-at-start-end-of-video-even-after-using-setpreferredvideos? We're capturing video, but if the user quick taps, we pull a still image from the video rather than present full video. Unfortunately, the still image ends up being blurry more often than not. – Crashalot Jan 22 '16 at 11:33
12

Just add this line of code to your file:

self.captureSession.sessionPreset = .photo

You will get awesome resolution, just like Apple's built-in camera app.

BilalReffas
6

We had the same issue using that code reference. The photos were especially bad when using the iPad front-facing camera.

We fixed it by adding a 1-second delay between setting up the capture manager and calling the capture-image method. It made a big enough difference that we were happy with the result. It appears the camera needs some time to open the shutter and perform its default auto-focus and white-balance adjustments.

[NSTimer scheduledTimerWithTimeInterval:1.0 
    target:self.captureManager 
    selector:@selector(captureStillImage) 
    userInfo:nil repeats:NO];
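
An equivalent one-second delay using GCD instead of NSTimer (a sketch, reusing the captureManager and captureStillImage names from above) would be:

// Same one-second delay expressed with dispatch_after
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [self.captureManager captureStillImage];
});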
Jojodmo
John Grant
  • Yeah, I have tried this too, but I think I shouldn't worry about it, because the image captured by Apple's AVCam sample code also has bad quality... only the original iPhone camera captures good pictures. :-) – TheTiger Jun 09 '12 at 14:54