
(Sorry for my English.) Looking for information about reading frames from a video on the iPhone, I found this project: http://www.codza.com/extracting-frames-from-movies-on-iphone/comment-page-1#comment-1116. But I also read somewhere that you can use AVFoundation to capture frames from a video for better performance.

However, I can't find any information on how to do that.

Any ideas?

Thanks for reading.

matiasfha

5 Answers


You're talking about using the calls for generating what Apple calls thumbnail images from videos at specific times.

For an MPMoviePlayerController (what iOS uses to hold a video from a file or other source), there are two commands to do this. The first one generates a single thumbnail (image) from a movie at a specific point in time, and the second one generates a set of thumbnails for a time range.

This example gets an image at 10 seconds into a movie clip, myMovie.mp4:

// Assumes myMovie.mp4 ships in the app bundle; use a proper file URL
// (not URLWithString:) for local files.
MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
        initWithContentURL:[NSURL fileURLWithPath:
            [[NSBundle mainBundle] pathForResource:@"myMovie" ofType:@"mp4"]]];
UIImage *singleFrameImage = [movie thumbnailImageAtTime:10.0
        timeOption:MPMovieTimeOptionExact];

Note that this runs synchronously, i.e. the user will be forced to wait while you get the screenshot.

The other option is to get a series of images from a movie, from an array of times:

MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
        initWithContentURL:[NSURL fileURLWithPath:
            [[NSBundle mainBundle] pathForResource:@"myMovie" ofType:@"mp4"]]];
NSNumber *time1 = @10.0;
NSNumber *time2 = @11.0;
NSNumber *time3 = @12.0;
NSArray *times = [NSArray arrayWithObjects:time1, time2, time3, nil];
[movie requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionExact];

This second way will trigger a notification of type MPMoviePlayerThumbnailImageRequestDidFinishNotification each time a new image is generated. You can set up an observer to monitor this and process the image - I'll leave you to work that bit out on your own!
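A minimal sketch of that observer setup, assuming the `movie` variable from the snippet above (`MPMoviePlayerThumbnailImageKey` and `MPMoviePlayerThumbnailErrorKey` are the documented userInfo keys for this notification):

```objc
// Register before calling requestThumbnailImagesAtTimes:timeOption:
[[NSNotificationCenter defaultCenter] addObserver:self
        selector:@selector(thumbnailReady:)
        name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
        object:movie];

// Called once per generated thumbnail
- (void)thumbnailReady:(NSNotification *)notification {
    UIImage *image = notification.userInfo[MPMoviePlayerThumbnailImageKey];
    if (image) {
        // Use the image, e.g. store it or show it in a UIImageView
    } else {
        NSLog(@"Thumbnail failed: %@",
              notification.userInfo[MPMoviePlayerThumbnailErrorKey]);
    }
}
```

Remember to remove the observer (e.g. in dealloc) when you are done.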

h4xxr
  • Thanks! It's a good idea. I solved this using ffmpeg, but I want something native to the iOS SDK, so I will try your idea. – matiasfha Nov 17 '10 at 21:08
  • about the first part, do you have any idea how to do it asynchronously? – Vincent Bacalso Sep 10 '11 at 03:24
  • @BacalsoVincent you wouldn't use the first way if you need it asynchronously... if you only want one image, but async, use the second method with an array of only one time :) – h4xxr Nov 13 '11 at 00:18
  • The problem with this method is that it crashes if you background the app. – Dex Apr 20 '12 at 04:42
  • MPMoviePlayerThumbnailImageRequestDidFinishNotification doesn't work – KETAN May 05 '12 at 08:06
  • Is this method of extraction reliable? I mean, does it really work well? I can implement the same functionality using iFrameExtractor https://github.com/lajos/iFrameExtractor, but I don't want to go through that method and am thinking of implementing this one instead. So please point me in the right direction! Thanks in advance – Parvez Belim Jul 02 '13 at 05:16
  • I also face the same issue: MPMoviePlayerThumbnailImageRequestDidFinishNotification is not fired – Heena Feb 28 '14 at 11:59
  • MPMoviePlayerController is deprecated in iOS 9. https://developer.apple.com/documentation/mediaplayer/mpmovieplayercontroller?language=objc – tokentoken Jan 15 '20 at 15:11

You can also try AVAssetImageGenerator, specifically generateCGImagesAsynchronouslyForTimes:completionHandler:.

This SO answer has good example code.
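As a rough sketch of what that looks like (assuming `videoURL` is a local file URL you already have; the completion handler fires once per requested time, on a background queue):

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator =
        [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;

NSArray *times = @[[NSValue valueWithCMTime:CMTimeMake(10, 1)],
                   [NSValue valueWithCMTime:CMTimeMake(11, 1)]];
[generator generateCGImagesAsynchronouslyForTimes:times
        completionHandler:^(CMTime requestedTime, CGImageRef image,
                            CMTime actualTime,
                            AVAssetImageGeneratorResult result,
                            NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *frame = [UIImage imageWithCGImage:image];
        // Hop back to the main queue before touching UIKit with `frame`
    } else {
        NSLog(@"Frame generation failed: %@", error);
    }
}];
```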

Steve

Swift 2 code to take frames with AVAssetImageGenerator:

func previewImageForLocalVideo(url:NSURL) -> UIImage?
{
    let asset = AVAsset(URL: url)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true

    var time = asset.duration
    // If possible, avoid the very first frame (it can be completely black or white in camera videos)
    time.value = min(time.value, 2)

    do {
        let imageRef = try imageGenerator.copyCGImageAtTime(time, actualTime: nil)
        return UIImage(CGImage: imageRef)
    }
    catch let error as NSError
    {
        print("Image generation failed with error \(error)")
        return nil
    }
}
Avt

Here is code to extract images from a video at a given number of frames per second (FPS):

1) Import

#import <Photos/Photos.h>

2) in viewDidLoad

    videoUrl = [NSURL fileURLWithPath:[[NSBundle mainBundle]pathForResource:@"VfE_html5" ofType:@"mp4"]];
    [self createImage:5]; // 5 is frame per second (FPS) you can change FPS as per your requirement.

3) Functions

-(void)createImage:(int)withFPS {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.requestedTimeToleranceAfter =  kCMTimeZero;
    generator.requestedTimeToleranceBefore =  kCMTimeZero;

    for (Float64 i = 0; i < CMTimeGetSeconds(asset.duration) *  withFPS ; i++){
        @autoreleasepool {
            CMTime time = CMTimeMake(i, withFPS);
            NSError *err;
            CMTime actualTime;
            CGImageRef image = [generator copyCGImageAtTime:time actualTime:&actualTime error:&err];
            if (image == NULL) {
                NSLog(@"Failed to generate frame at index %f: %@", i, err);
                continue;
            }
            UIImage *generatedImage = [[UIImage alloc] initWithCGImage:image];
            [self savePhotoToAlbum:generatedImage]; // Saves the image to the photo library
            CGImageRelease(image);
        }
    }
}

-(void)savePhotoToAlbum:(UIImage*)imageToSave {

    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromImage:imageToSave];
    } completionHandler:^(BOOL success, NSError *error) {
        if (success) {
            NSLog(@"Success.");
        }
        else {
            NSLog(@"Failed: %@", error);
        }
    }];
}
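One caveat worth adding (not part of the original answer): writing to the photo library requires user authorization, so request it before kicking off the export. A sketch:

```objc
// Ask for Photos permission once, then start extracting frames.
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status == PHAuthorizationStatusAuthorized) {
        dispatch_async(dispatch_get_main_queue(), ^{
            [self createImage:5];
        });
    } else {
        NSLog(@"Photo library access denied");
    }
}];
```

You will also need an NSPhotoLibraryUsageDescription entry in Info.plist on iOS 10 and later.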
Hardik Thakkar
  • the `generator.requestedTimeToleranceAfter = kCMTimeZero;` `generator.requestedTimeToleranceBefore = kCMTimeZero;` lines seem to be quite important. otherwise the generator defaults to only returning certain time points from the video, no matter what point in time you request. – peter Dec 11 '19 at 00:10

In Swift 4 this worked for me with some modifications, mainly changing the "at" parameter of imageGenerator.copyCGImage to a CMTime type:

func showFrame(from file:String) {
    let file = file.components(separatedBy: ".")
    guard let path = Bundle.main.path(forResource: file[0], ofType:file[1]) else {
        debugPrint( "\(file.joined(separator: ".")) not found")
        return
    }
    let url = URL(fileURLWithPath: path)
    let image = previewImageForLocalVideo(url: url)
    let imgView = UIImageView(image: image)
    view.addSubview(imgView)
}    

func previewImageForLocalVideo(url:URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let tVal = CMTime(value: 12, timescale: 1) // the NSValue-to-CMTime force cast would crash at runtime
    do {
        let imageRef = try imageGenerator.copyCGImage(at: tVal, actualTime: nil)
        return UIImage(cgImage: imageRef)
    }
    catch let error as NSError
    {
        print("Image generation failed with error \(error)")
        return nil
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    showFrame(from:"video.mp4")
}

Source

Mike Lee