
I have a requirement to take a video, convert each frame to an image, and save those images to disk. I'd like to use AVAssetImageGenerator for efficiency's sake, and have code similar to the snippet below.

The issue is that I don't know when all image generation is complete, but I need to take action once all frames are written to disk. For example:

    assetGenerator.generateCGImagesAsynchronously(forTimes: frameTimes, completionHandler: { (requestedTime, image, actualTime, result, error) in
        // 1. Keep a reference to each image
        // 2. Wait until all images are generated
        // 3. Process images as a set
    })

It's step 2 above that's tripping me up. I imagine I can try to count the number of times the completion handler gets called, and trigger the appropriate method when the count equals the number of frames.
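For reference, here's roughly what that counting approach could look like (just a sketch, assuming frameTimes is the [NSValue] array I pass to the generator, with a DispatchGroup standing in for a manual counter):

    let group = DispatchGroup()
    frameTimes.forEach { _ in group.enter() }

    var images = [CGImage]()
    let lock = NSLock()

    assetGenerator.generateCGImagesAsynchronously(forTimes: frameTimes) { requestedTime, image, actualTime, result, error in
        if result == .succeeded, let image = image {
            lock.lock()
            images.append(image)   // keep a reference to each generated image
            lock.unlock()
        }
        group.leave()              // called once per requested time, even on failure or cancellation
    }

    group.notify(queue: .main) {
        // every requested frame has been handled; process `images` as a set here
    }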

But I'm wondering: is there a way to use the API to know when every frame has been processed? Maybe it's just something I've missed. Any guidance or advice would be appreciated.

terrafirma9

1 Answer


I would process the images progressively; you won't be able to fit them all in memory at once anyway. To do that you could sample the video at certain times using assetGenerator.copyCGImage(at:actualTime:).
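For example (a sketch only; the asset variable and the one-second sampling step are assumptions for illustration):

    let generator = AVAssetImageGenerator(asset: asset)

    var time = CMTime.zero
    let step = CMTime(value: 1, timescale: 1)   // sample roughly once per second

    while time < asset.duration {
        var actualTime = CMTime.zero
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: &actualTime) {
            // save or process cgImage; actualTime is the time of the frame you actually got
        }
        time = time + step
    }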

But then you may oversample (repeating frames) or undersample (skipping frames). If you care about that, then try using AVAssetReader to read all the frames in the video:

    import AVFoundation

    let reader = try! AVAssetReader(asset: asset)
    let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]

    // read video frames as BGRA pixel buffers
    let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack,
        outputSettings: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])

    reader.add(trackReaderOutput)
    reader.startReading()

    // copyNextSampleBuffer() returns nil once the track has been read to the end
    while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
        if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            // now you have the frames as CVImageBuffers/CVPixelBuffers
            process(imageBuffer)
        }
    }

This gives you all the frames as CVPixelBuffers. You can easily convert them to other types, e.g. using Core Image as in the sketch below. If you're interested in the timestamp of a frame, call CMSampleBufferGetPresentationTimeStamp(sampleBuffer).
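For example, a minimal Core Image sketch (one option among several, assuming you just want a CGImage back):

    import CoreImage

    let ciContext = CIContext()

    func cgImage(from pixelBuffer: CVPixelBuffer) -> CGImage? {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        return ciContext.createCGImage(ciImage, from: ciImage.extent)
    }

    // inside the reading loop, if you need the frame's timestamp:
    // let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)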

Rhythmic Fistman
  • Thank you, this looks like a better solution. – terrafirma9 Sep 23 '16 at 14:10
  • Hey, I tried using this code and it worked fine, but when I get the imageBuffer and convert it to a UIImage I get a memory crash. Any idea how to fix that? – Sharad Chauhan Apr 06 '18 at 11:09
  • Sounds like you’re retaining too many image buffers or UIImages (see the sketch after these comments). If that doesn’t help you fix the problem, post your code in a new question. – Rhythmic Fistman Apr 06 '18 at 11:26
  • Yes, but the orientation goes back to its native state, so we have to do something to rotate the images to match the recording orientation, which means some additional coding and searching. – Hope May 04 '18 at 07:02
  • Ah - is that something `generateCGImagesAsynchronously` does for you? I didn't know that. – Rhythmic Fistman May 04 '18 at 07:07
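Regarding the memory crash discussed in the comments, a minimal sketch of one way to keep memory flat (an assumption, not the commenter's actual code) is to wrap each iteration of the reading loop in an autoreleasepool so intermediate UIImages are released before the next frame is read:

    while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
        autoreleasepool {
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                // convert to UIImage and write to disk here, then let the
                // temporaries go out of scope before reading the next frame
                process(imageBuffer)
            }
        }
    }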