19

I am developing an iPhone application that captures video from the camera and stores the video file on the file system.

I need to create a thumbnail image of this video file before I save the image to the file system. My goal is to show a list of thumbnails of the recorded videos so that the user can select a specific thumbnail to play the desired file.

Could someone please advise how I can create a thumbnail image of a video file that has been captured by the camera?

Also, is it possible to create a thumbnail of an existing video file using the iOS SDK?

Pang
Abishek
  • possible duplicate of [How to take a screenshot programmatically](http://stackoverflow.com/questions/2200736/how-to-take-a-screenshot-programmatically) – jscs May 11 '11 at 18:22
  • 1
    I was actually referring to generating a thumbnail of a video file on IOS and not taking a screenshot programatically. To take a screenshot, one would have to play the file, but in my case playing the file is not necessary. – Abishek Jun 16 '11 at 03:24

5 Answers

35

A better solution is to use the AVFoundation framework for this. It bypasses the need to construct an MPMoviePlayerController, which causes the camera iris to remain closed when used in conjunction with the UIImagePickerController (at least, that's what I experienced).

The code I use:

+ (UIImage *)thumbnailFromVideoAtURL:(NSURL *)contentURL {
    UIImage *theImage = nil;
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    // Rotate the extracted frame to match the video track's orientation.
    generator.appliesPreferredTrackTransform = YES;
    NSError *err = nil;
    // Grab the frame at 1/60th of a second into the video.
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generator copyCGImageAtTime:time actualTime:NULL error:&err];

    theImage = [[[UIImage alloc] initWithCGImage:imgRef] autorelease];

    CGImageRelease(imgRef);
    [asset release];
    [generator release];

    return theImage;
}
Werner Altewischer
  • To get a thumbnail at the end of the video: `AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset]; CMTime duration = playerItem.duration;` – Underdog Sep 23 '13 at 01:07
  • @Underdog You don't need an `AVPlayerItem` for that; just fetch the duration from `AVAsset` directly. – Ja͢ck Nov 20 '14 at 07:34
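Following up on the comments above, a minimal sketch (in the same manual-retain-release style as the answer) that reads the duration straight from the `AVAsset` to grab a frame near the end; the method name is hypothetical:

```objc
#import <AVFoundation/AVFoundation.h>

+ (UIImage *)lastFrameThumbnailFromVideoAtURL:(NSURL *)contentURL {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:contentURL options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;

    // asset.duration is available without constructing an AVPlayerItem.
    CMTime duration = asset.duration;

    NSError *err = nil;
    CGImageRef imgRef = [generator copyCGImageAtTime:duration actualTime:NULL error:&err];
    UIImage *theImage = imgRef ? [[[UIImage alloc] initWithCGImage:imgRef] autorelease] : nil;

    CGImageRelease(imgRef);
    [asset release];
    [generator release];
    return theImage;
}
```

With the default (infinite) time tolerances, requesting a frame at the exact duration returns the nearest decodable frame, which is usually what you want for an end-of-video thumbnail.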
19

Try this (it doesn't actually show the movie player):

+ (UIImage *)imageFromMovie:(NSURL *)movieURL atTime:(NSTimeInterval)time {
  // set up the movie player
  MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] 
    initWithContentURL:movieURL];
  mp.shouldAutoplay = NO;
  mp.initialPlaybackTime = time;
  mp.currentPlaybackTime = time;
  // get the thumbnail
  UIImage *thumbnail = [mp thumbnailImageAtTime:time 
                           timeOption:MPMovieTimeOptionNearestKeyFrame];
  // clean up the movie player
  [mp stop];
  [mp release];
  return(thumbnail);
}

It's a synchronous call, so it may block the main thread briefly, but it runs almost instantly for me when I use a time near the beginning of the movie. If you're doing this a lot, you can add it as a category on UIImage, which is what I did.

I see from your question that you want to do this before the movie is saved, and it probably won't work without a file URL. However, if you're using UIImagePickerController for camera capture, you can pass this function the URL returned in the info dictionary of imagePickerController:didFinishPickingMediaWithInfo: under the key UIImagePickerControllerMediaURL.
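A sketch of that delegate wiring, assuming the function above has been added as a category class method on UIImage (as suggested); the `dismissModalViewControllerAnimated:` call is the era-appropriate API:

```objc
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // The captured movie's file URL, available before you copy/save it yourself.
    NSURL *movieURL = [info objectForKey:UIImagePickerControllerMediaURL];
    UIImage *thumbnail = [UIImage imageFromMovie:movieURL atTime:0.0];
    // ... store or display the thumbnail, then dismiss the picker.
    [picker dismissModalViewControllerAnimated:YES];
}
```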

Jesse Crossen
3

Very simple; try this:

Step 1: Import the header #import <MediaPlayer/MediaPlayer.h>

Step 2: Get the URL path

NSURL *videoURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Sample" ofType:@"m4v"]];

Step 3: Finally, get the thumbnail

- (UIImage *)VideoThumbNail:(NSURL *)videoURL
{
    MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
    // Grabs the nearest keyframe to the 52-second mark; adjust the time to taste.
    UIImage *thumbnail = [player thumbnailImageAtTime:52.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
    [player stop];
    return thumbnail;
}
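Putting the three steps together (the bundled `Sample.m4v` is from Step 2, and `imageView` is a placeholder UIImageView):

```objc
NSURL *videoURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Sample" ofType:@"m4v"]];
imageView.image = [self VideoThumbNail:videoURL];
```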
Rajesh Loganathan
1

A solution that uses the AVFoundation framework and Swift 3.0. The commented-out lines aren't necessary and are discussed below the code; decide whether you need them:

import AVFoundation
import UIKit

func generateThumbnailForVideo(at url: URL) -> UIImage? {
    let kPreferredTimescale: Int32 = 1000
    let asset = AVURLAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    //generator.requestedTimeToleranceBefore = kCMTimeZero
    //generator.requestedTimeToleranceAfter = kCMTimeZero
    //generator.maximumSize = CGSize(width: 100, height: 100)

    var actualTime: CMTime = CMTime(seconds: 0, preferredTimescale: kPreferredTimescale)
    //generates thumbnail at first second of the video
    let cgImage = try? generator.copyCGImage(at: CMTime(seconds: 1, preferredTimescale: kPreferredTimescale), actualTime: &actualTime)
    return cgImage.map { UIImage(cgImage: $0, scale: UIScreen.main.scale, orientation: .up) }
}

Note that you may want to run this code on a background thread, as thumbnail creation can be a costly operation.
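For example, a sketch of dispatching the work off the main thread (where `videoURL` and `imageView` are placeholders, and `generateThumbnailForVideo(at:)` is the function above):

```swift
DispatchQueue.global(qos: .userInitiated).async {
    // Potentially slow: decodes a frame from the asset.
    let thumbnail = generateThumbnailForVideo(at: videoURL)
    DispatchQueue.main.async {
        // UI updates must happen on the main thread.
        imageView.image = thumbnail
    }
}
```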

Also, please take a look at some of the properties of AVAssetImageGenerator class:

  1. requestedTimeToleranceBefore (Apple's documentation):

The maximum length of time before a requested time for which an image may be generated.

The default value is kCMTimePositiveInfinity.

Set the values of requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero to request frame-accurate image generation; this may incur additional decoding delay.

  2. requestedTimeToleranceAfter (Apple's documentation):

The maximum length of time after a requested time for which an image may be generated.

The default value is kCMTimePositiveInfinity.

Set the values of requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero to request frame-accurate image generation; this may incur additional decoding delay.

  3. maximumSize (Apple's documentation):

Specifies the maximum dimensions for generated image.

The default value is CGSizeZero, which specifies the asset’s unscaled dimensions.

AVAssetImageGenerator scales images such that they fit within the defined bounding box. Images are never scaled up. The aspect ratio of the scaled image is defined by the apertureMode property.

Rafał Augustyniak
0

Try this:

generate.requestedTimeToleranceBefore = kCMTimeZero;
generate.requestedTimeToleranceAfter = kCMTimeZero;

These need to be set on your AVAssetImageGenerator (named generate here) to get the exact requested frame rather than the nearest keyframe.

Bhumika
Ayush Goel