
I would like to get all frames of a video in iOS 6 into an NSArray. I use this code:

-(void) getAllImagesFromVideo
{
   imagesArray = [[NSMutableArray alloc] init];
   times = [[NSMutableArray alloc] init];

   for (Float64 i = 0; i < 15; i += 0.033) // For 25 fps in 15 sec of Video
   {
       CMTime time = CMTimeMakeWithSeconds(i, 60);
       [times addObject:[NSValue valueWithCMTime:time]];
   }

   [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {

      if (result == AVAssetImageGeneratorSucceeded)
      {
         UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];

         [imagesArray addObject:generatedImage];
      }
   }];
}

In the iPad Simulator the delay is 90-100 seconds; on an iPad device I receive memory warnings and the app finally crashes.

Any ideas or solutions? Should I use another, more low-level framework or library? C++? This is very important for me! Help me! :)

Thanks!!!

Javi Campaña

2 Answers

12

You need to:

  1. release each CGImageRef with CGImageRelease, as @zimmryan mentioned
  2. wrap the body of your loop in an @autoreleasepool block
  3. not keep the images in memory; write them to your Documents directory instead

This is how I do it (note that I use the synchronous API for extracting images; it shouldn't matter if you choose the asynchronous version):

AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.requestedTimeToleranceAfter = kCMTimeZero;
generator.requestedTimeToleranceBefore = kCMTimeZero;
for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) * FPS; i++) { // FPS is your target frame rate, e.g. 25
  @autoreleasepool {
    CMTime time = CMTimeMake(i, FPS);
    NSError *err;
    CMTime actualTime;
    CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
    UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
    [self saveImage:generatedImage atTime:actualTime]; // Saves the image to the Documents directory, not memory
    CGImageRelease(image); // copyCGImageAtTime: transfers ownership, so we must release
  }
}
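The saveImage:atTime: helper is not shown above; a minimal sketch might look like the following (the JPEG quality and the timestamp-based filename are my own assumptions, not part of the original answer):

```objc
- (void)saveImage:(UIImage *)image atTime:(CMTime)time
{
    // Encode as JPEG rather than PNG to keep file sizes down.
    NSData *data = UIImageJPEGRepresentation(image, 0.8);

    // Name the file after the frame's timestamp, e.g. "frame-1.23.jpg".
    NSString *name = [NSString stringWithFormat:@"frame-%.2f.jpg",
                      CMTimeGetSeconds(time)];
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                           NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [documents stringByAppendingPathComponent:name];

    [data writeToFile:path atomically:YES];
}
```

This way only one decoded frame lives in memory at a time; you can reload individual frames from disk later with [UIImage imageWithContentsOfFile:].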

EDIT:

You should always consider using @autoreleasepool when creating a lot of temporary objects (see https://developer.apple.com/library/mac/documentation/cocoa/conceptual/memorymgmt/articles/mmAutoreleasePools.html)

ekeren
5

It sounds like you are running into memory issues from holding 375 images at once. Try this instead; it may provide better memory management.

-(void) getAllImagesFromVideo
{
   imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
   times = [[NSMutableArray alloc] initWithCapacity:375];

   for (Float64 i = 0; i < 15; i += 0.033) // For 25 fps in 15 sec of Video
   {
       [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 60)]];
   }

   [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
       if (result == AVAssetImageGeneratorSucceeded)
       {
           [imagesArray addObject:[UIImage imageWithCGImage:image]];
           CGImageRelease(image);
       }
   }];
}
zimmryan
  • Thanks, but it still crashes, any idea? – Javi Campaña Jul 03 '13 at 21:48
  • I think that storing so many UIImages in an NSArray uses a large amount of memory. – Javi Campaña Jul 04 '13 at 06:18
  • When you look at the crash logs, does it tell you that the OS killed your app for using too much memory? If so, look at saving the images at a lower resolution, using JPEG compression, and/or writing them to disk asynchronously instead of storing them in memory. – zimmryan Jul 04 '13 at 18:11
  • You can change the output size by setting imageGenerator.maximumSize = CGSizeMake(320, 180) or any other size instead of full resolution. – zimmryan Jul 04 '13 at 18:17
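Putting the last two comments together, a lower-resolution, disk-backed variant of the completion handler might look like this (the 320x180 size, JPEG quality, and filename scheme are illustrative assumptions):

```objc
// Ask the generator for downscaled frames so each decoded image is small.
imageGenerator.maximumSize = CGSizeMake(320, 180);

[imageGenerator generateCGImagesAsynchronouslyForTimes:times
    completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                        AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded)
    {
        @autoreleasepool {
            // Compress to JPEG and write straight to disk instead of
            // accumulating UIImages in an NSArray.
            NSData *jpeg = UIImageJPEGRepresentation([UIImage imageWithCGImage:image], 0.7);
            NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                   NSUserDomainMask, YES) objectAtIndex:0];
            NSString *path = [documents stringByAppendingPathComponent:
                              [NSString stringWithFormat:@"frame-%.2f.jpg",
                               CMTimeGetSeconds(actualTime)]];
            [jpeg writeToFile:path atomically:YES];
        }
    }
}];
```

At 320x180 each decoded frame is roughly 230 KB in memory instead of several megabytes at full resolution, which should keep the process well under the memory-warning threshold.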