
I have been making great use of Apple's RosyWriter example, which lets you capture video passed through a GLSL shader layer, from this link.

I want to extend it a bit so that it can capture not just video but also photos, using the same capture session, same video settings, same resolution, etc. (essentially just capturing a single frame of video into an image).

It should be straightforward, but I can't seem to find where I need to grab the buffer from and save it to the Photo Library.

From what I understand, I can use the delegate method:

```objectivec
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
```

But I have not had any success with it. Can anyone point me in the right direction?

Nirmalsinh Rathod
Malu05
  • What's stopping you from pointing yourself in the right direction by running a search? Stack Overflow isn't a run-a-search-for-me service web site. – El Tomato Jul 29 '17 at 22:05
  • I have spent quite a few hours in here trying to find an answer. The closest was this: https://stackoverflow.com/questions/22928350/how-to-programmatically-take-photos-while-recording-video but RosyWriter does not operate on the camera output directly, rather on the buffer that is returned from the GLSL shader, and that is what seems to be driving me off-course here. – Malu05 Jul 29 '17 at 22:13

1 Answer


I found a solution.

Rather than using the sampleBuffer, which contains the unmodified sample from the camera, I had to use the renderedPixelBuffer.

The catch is that while sampleBuffer is a CMSampleBufferRef, the renderedPixelBuffer is a CVPixelBufferRef.

Using CMSampleBufferCreateForImageBuffer, I converted it into a sample buffer that I could then save as an image.
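For reference, here is a minimal sketch of that conversion. It assumes renderedPixelBuffer is the CVPixelBufferRef coming out of RosyWriter's shader stage; the helper name savePixelBuffer: is hypothetical, and the CIImage/UIImage path at the end is just one possible way to write the frame out:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Hypothetical helper: wrap the shader's CVPixelBufferRef in a
// CMSampleBufferRef, then render it to a UIImage and save it.
- (void)savePixelBuffer:(CVPixelBufferRef)renderedPixelBuffer
{
    // 1. Build a format description for the pixel buffer.
    CMVideoFormatDescriptionRef formatDesc = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault,
                                                 renderedPixelBuffer,
                                                 &formatDesc);

    // 2. Wrap the pixel buffer in a CMSampleBufferRef. Timing can be
    //    invalid because we only want a single still image.
    CMSampleTimingInfo timing = {kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid};
    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                       renderedPixelBuffer,
                                       true,        // data is already ready
                                       NULL, NULL,  // no make-ready callback
                                       formatDesc,
                                       &timing,
                                       &sampleBuffer);

    // 3. Render the wrapped buffer to a UIImage and save it to the
    //    Photo Library.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);

    CGImageRelease(cgImage);
    if (sampleBuffer) CFRelease(sampleBuffer);
    if (formatDesc) CFRelease(formatDesc);
}
```

Note that saving to the Photo Library requires the appropriate permission entry in Info.plist (NSPhotoLibraryAddUsageDescription on recent iOS versions), and this code needs to run on a device since it depends on the live capture pipeline.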

Malu05