
I have pretty much the same question as this one below:

Switching AVCaptureSession preset when capturing a photo

The issue however is that the (self) answer doesn't help me one bit. I am wondering if someone has a clue as to how to do this.

I am capturing video frames so I can process them and do something with them. For this, I am using AVCaptureSessionPreset640x480, as I need all the frame rate I can get while still getting a decent frame for computation. Now, when the user wants to capture a photo, I want him to be able to capture it at the highest resolution possible. For that, I am not in favor of:

[captureSession beginConfiguration];
captureSession.sessionPreset = <some_preset>;
[captureSession commitConfiguration];

I have tried this, and in some cases I get an error; even when it works, there is a definite lag on the screen for some reason, so I don't trust this approach anymore.

Is there a definite way of getting a high-resolution image from the camera while capturing video output at a very low resolution (e.g. 640x480)?

p0lAris
  • You might want to try moving some of the work to a separate thread. It is possible that commitConfiguration pauses the thread due to device synchronisation, so if possible, put this on a separate thread (it might work or not). In any case, you should expect that after calling this method your next frame could still be low resolution, so you need to check the sample on every frame. Once you receive a high-res sample, you should again use a separate thread to generate the image (or whatever you are doing), and probably also lock the "take picture" button/trigger. – Matic Oblak Jul 15 '14 at 14:22
  • Thanks for the suggestion, but that would be terrible for the app. Reconfiguration also re-focuses, so I could never get the user to focus and click. I was wondering if there was a standard way of doing this, as a lot of people might need it. – p0lAris Jul 15 '14 at 14:46
  • I believe you have come to a dead end at this point (I could be wrong, though). Maybe, if possible, you could simply capture high-resolution samples and then downsample them yourself before feeding them to your processing. To gain some performance, you could try to push this onto the GPU, but there is no guarantee it will gain any performance at all. I hope you find a good solution and post it here. – Matic Oblak Jul 16 '14 at 06:50
  • Matic, I asked this question on `devforums` and they said it's impossible. Further, this is exactly what I am doing right now: downsampling the buffer, but the resize is the most costly operation. Having to resize a smaller image to a much smaller one for processing is better than resizing a large one. Maybe I could offload that particular task to the GPU, but as you said, it doesn't guarantee any extra performance. But yes, if I find something up my sleeve, I'll surely post it here. – p0lAris Jul 16 '14 at 07:05
  • 1
    You know you could use some tricks for downsampling as well: If the source sample is much larger then down-sampled you could try to remove every odd row and column from the sample and then down-sample it to a proper size (just a thought, I never tried it). This should be quite a performance gain but its hard to say what the result would look like. Anyway if you do end up with using the GPU you should look at this answer http://stackoverflow.com/questions/9550297/faster-alternative-to-glreadpixels-in-iphone-opengl-es-2-0/9704392#9704392 it might actually be your solution. – Matic Oblak Jul 16 '14 at 07:15
  • Yes, sounds like a good idea and that particular answer might be of big help. Thanks a lot. I'll look further into this when I am back on it. Cheers! – p0lAris Jul 16 '14 at 10:40
  • How'd things turn out? – aleclarson Oct 16 '14 at 04:12
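The "remove every odd row and column" trick suggested in the comments can be sketched in plain C, independent of AVFoundation. This is a hypothetical helper (the name `decimate_half` and the 32-bit-per-pixel assumption are mine, not from the thread); the real code would read pixels out of a locked CVPixelBuffer and respect its bytes-per-row, but the core loop is the same:

```c
#include <stdint.h>
#include <stddef.h>

/* Keep every other row and every other column of a 32-bit-per-pixel frame.
   src holds width * height pixels; dst must have room for
   (width / 2) * (height / 2) pixels. No filtering is applied, which is
   exactly why this is cheap (and why the result may alias). */
void decimate_half(const uint32_t *src, size_t width, size_t height,
                   uint32_t *dst)
{
    size_t dw = width / 2, dh = height / 2;
    for (size_t y = 0; y < dh; y++) {
        for (size_t x = 0; x < dw; x++) {
            /* take the pixel at (2x, 2y), skipping odd rows and columns */
            dst[y * dw + x] = src[(2 * y) * width + (2 * x)];
        }
    }
}
```

A proper area-averaging resize on the half-size result should then cost roughly a quarter of what it costs on the full frame.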

2 Answers


You can add an AVCapturePhotoOutput to the AVCaptureSession object and set its highResolutionCaptureEnabled property to YES. At capture time, call AVCapturePhotoOutput's capturePhotoWithSettings:delegate: method, passing it an AVCapturePhotoSettings object and an AVCapturePhotoCaptureDelegate. Through the AVCapturePhotoSettings object you can further modify capture properties such as flashMode, autoStillImageStabilizationEnabled and highResolutionPhotoEnabled. The usage is shown in Apple's AVCamManual sample.
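A minimal sketch of the above (iOS 10+; `captureSession` is assumed to be your already-configured session, and `self` a hypothetical controller adopting AVCapturePhotoCaptureDelegate):

```objc
// Attach a photo output alongside the low-res video data output and
// opt in to high-resolution stills.
AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
if ([captureSession canAddOutput:photoOutput]) {
    [captureSession addOutput:photoOutput];
    photoOutput.highResolutionCaptureEnabled = YES;
}

// Later, when the user taps the shutter:
AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
settings.highResolutionPhotoEnabled = YES;
settings.flashMode = AVCaptureFlashModeAuto;
[photoOutput capturePhotoWithSettings:settings delegate:self];
// self receives the result via the AVCapturePhotoCaptureDelegate
// callbacks, e.g. -captureOutput:didFinishProcessingPhoto:error:.
```

The point is that the still is taken through a separate output, so the session preset driving the video frames never has to change.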

Gaurav Raj

For iPhone 6 and iPhone 6 Plus, you can use the newer AVCaptureStillImageOutput API: [stillImageOutput setHighResolutionStillImageOutputEnabled:YES]. When this flag is turned on (it is off by default), you can capture a high-res still image while running a low-resolution capture session.
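A hedged sketch of this approach (the variable names are mine; `captureSession` is assumed to be your existing 640x480 session, and this API was later deprecated in favor of AVCapturePhotoOutput):

```objc
// Keep the session preset low for the video data output, but let the
// still image output deliver full-resolution photos on supported devices.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
if ([captureSession canAddOutput:stillOutput]) {
    [captureSession addOutput:stillOutput];
    stillOutput.highResolutionStillImageOutputEnabled = YES; // off by default
}

// When the user takes the picture:
AVCaptureConnection *connection =
    [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
        if (buffer) {
            NSData *data = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:buffer];
            UIImage *image = [UIImage imageWithData:data]; // full-res still
        }
    }];
```

Since the still is requested asynchronously on its own connection, the video frames keep flowing at the low-resolution preset without the reconfiguration lag described in the question.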

Ji Fang