I'm looking for a way to retrieve the individual frames of a video using the iOS API. I tried using AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too coarse for my usage.

From what I understand of the documentation, with a pipeline of AVAssetReader, AVAssetReaderOutput and CMSampleBufferGetImageBuffer I should be able to get somewhere, but I'm stuck with a CVImageBufferRef. From there I'm looking for a way to get a CGImageRef or a UIImage, but haven't found one.

Real-time performance is not needed, and the more I can stick to the provided APIs the better.

Thanks a lot!

Edit: Based on this site: http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation and this question: how to convert a CVImageBufferRef to UIImage, I'm nearing a solution. The problem: the AVAssetReader stops reading after the first copyNextSampleBuffer without giving me anything (the sampleBuffer is NULL).

The video is readable by MPMoviePlayerController, so I don't understand what's wrong.
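For reference, my reader setup looks roughly like this (simplified; `videoURL` is a placeholder for the asset's URL, and error handling is omitted):

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

NSError *error = nil;
AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

// Ask for BGRA pixel buffers so they can be fed into a CGBitmapContext later
NSDictionary *settings = [NSDictionary dictionaryWithObject:
    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderTrackOutput *output =
    [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:settings];
[reader addOutput:output];
[reader startReading];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... this is where I'd like to turn imageBuffer into a UIImage ...
    CFRelease(sampleBuffer);
}
// afterwards, reader.status tells you whether reading completed or failed
```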

hlidotbe

3 Answers

AVAssetImageGenerator has very loose default tolerances for the exact frame time that is grabbed. It has two properties that determine the tolerance: requestedTimeToleranceBefore and requestedTimeToleranceAfter. These tolerances default to kCMTimePositiveInfinity, so if you want exact frames, set both to kCMTimeZero.

(It may take longer to grab the exact frames than approximate frames, but you state that realtime is not an issue.)
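A minimal sketch of the above (`videoURL` is assumed to point at your asset):

```objc
AVAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

// Default is kCMTimePositiveInfinity; zero means "this exact frame"
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter = kCMTimeZero;

NSError *error = nil;
CMTime actualTime;
// e.g. the frame at exactly 1.5 seconds (900/600)
CMTime requestedTime = CMTimeMake(900, 600);
CGImageRef image = [generator copyCGImageAtTime:requestedTime
                                     actualTime:&actualTime
                                          error:&error];
UIImage *frame = [UIImage imageWithCGImage:image];
CGImageRelease(image);
```

With the tolerances at zero, actualTime should match the requested time instead of snapping to the nearest keyframe.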

brainjam

The two links above actually answer my question, and the empty copyNextSampleBuffer turned out to be an issue with iOS SDK 5.0b3; it works on the device.
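For reference, the CVImageBufferRef-to-UIImage conversion from the linked post boils down to roughly this (simplified; it assumes the reader output was configured for kCVPixelFormatType_32BGRA):

```objc
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);

void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// Wrap the BGRA pixel data in a bitmap context, then snapshot it as a CGImage
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
    bytesPerRow, colorSpace,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef cgImage = CGBitmapContextCreateImage(context);

UIImage *image = [UIImage imageWithCGImage:cgImage];

CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
```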

hlidotbe
  • How did you retrieve all frames? I tried the steps mentioned in 7twenty7. In the method **readNextMovieFrame**, I added the `if` condition inside a `while` loop which breaks only when `movieReader.status == AVAssetReaderStatusCompleted`. I then retrieve the image as mentioned in the [POST](http://stackoverflow.com/questions/3152259/how-to-convert-a-cvimagebufferref-to-uiimage) and add it to an array. But strangely the app crashes even for a video of 6 seconds. I know I am doing something foolishly wrong. Can you please help me with this? – Roshit Jul 19 '12 at 13:44
  • You definitely shouldn't keep frames in an array or you'll blow through memory very quickly (which is probably what's happening to you). A 720p frame is about 3.5 MB uncompressed. At 30 fps, do the math :) – hlidotbe Jul 19 '12 at 17:47
  • Thanks for the quick response. Any suggestions as to what I should be doing? – Roshit Jul 19 '12 at 20:54
  • Well, it depends on what you are trying to do. – hlidotbe Jul 20 '12 at 18:10
  • Once the video is taken, I want to get the individual frames of the video. The whole set of frames. – Roshit Jul 20 '12 at 20:49
  • To do what? Because you'll never be able to hold all the frames of a video (longer than a few seconds) in memory. You will need to do something with each frame and drop it as soon as possible. For reference, I needed this for this project: https://github.com/epicagency/cinegraphr and in every part of the app I have to work with intermediate movie clips to avoid keeping too much in memory. – hlidotbe Jul 21 '12 at 07:58
  • How would you handle the app going into the background while you are processing a frame? How do you resume where you left off once the app returns to active or foreground state? – Michael Nguyen Mar 26 '14 at 10:53
Use AVReaderWriter. Though it's OS X sample code from Apple, AVFoundation is available on both platforms with only minor changes.

Bobjt