I'm looking for a way to create long time lapse videos on an iPhone running iOS 9, and hoping to get some pointers on how to start. Ideally I would compress one hour of footage into one minute, so the scaling factor is 60: keep one frame out of every 60 and stitch them together, right?
I have a project which uses AVFoundation to capture images via `captureOutput:didOutputSampleBuffer:fromConnection:`.
However, I'm not sure if there are better approaches to creating a time lapse over several hours.
Would it make sense to take individual photos and stitch them together (activating the camera every few seconds)?
Or should I just keep frames out of the `CMSampleBufferRef` stream?
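To make the frame-dropping idea concrete, here is a minimal sketch of the second approach (Swift 2-era style, since this targets iOS 9). It assumes an `AVCaptureVideoDataOutput` is already configured with this object as its sample buffer delegate; the `append` helper and the constant 60 are my own illustrative names, not anything from an existing project:

```swift
import AVFoundation

// Hypothetical delegate that keeps one frame out of every 60
// delivered by AVCaptureVideoDataOutput.
class TimeLapseFrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var frameCount = 0
    private let keepEveryNth = 60  // 1 h of capture -> 1 min of playback at equal frame rates

    // iOS 9-era delegate callback (the selector named in the question)
    func captureOutput(captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       fromConnection connection: AVCaptureConnection!) {
        frameCount += 1
        guard frameCount % keepEveryNth == 0 else { return }  // drop 59 of every 60 frames

        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            append(pixelBuffer)  // hand the kept frame to a writer (e.g. AVAssetWriter)
        }
    }

    private func append(pixelBuffer: CVPixelBuffer) {
        // writer code elided
    }
}
```

Note that this keeps the full video pipeline running, so every frame is still captured and decoded even though 59 of 60 are discarded, which is the battery trade-off versus waking the still camera every few seconds.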
Are there other APIs I can use for capturing camera images?
I'm hoping to understand which approach would give the best image quality and battery life.
I'm looking at this question, which appears to have code for stitching images together, but I'm not sure whether I need anything else for my project.
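For reference, the stitching side can be done with `AVAssetWriter` by appending pixel buffers with compressed timestamps. This is a hedged sketch under stated assumptions (Swift 2-era API style; `outputURL`, the dimensions, and the 30 fps playback rate are placeholders I chose, not values from the question):

```swift
import AVFoundation

// Sketch: set up an AVAssetWriter to stitch kept frames into a movie.
func makeWriter(outputURL: NSURL, width: Int, height: Int) throws
        -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(URL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    let settings: [String: AnyObject] = [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
    input.expectsMediaDataInRealTime = false  // we feed frames at our own pace
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.addInput(input)
    return (writer, input, adaptor)
}

// The time compression happens in the presentation timestamps:
// if kept frame i is stamped CMTimeMake(Int64(i), 30), it plays at i/30 s,
// so 60 seconds of real time (one kept frame per second) become 1 second of video.
// adaptor.appendPixelBuffer(pixelBuffer, withPresentationTime: CMTimeMake(Int64(i), 30))
```

The key design point is that the timestamps you assign at write time, not the capture rate, determine the playback speed-up.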