You're talking about the calls Apple provides for generating what it calls thumbnail images from videos at specific times.
For an MPMoviePlayerController (the class iOS uses to play a video from a file or other source), there are two methods to do this. The first generates a single thumbnail (image) from the movie at a specific point in time, and the second generates a set of thumbnails for an array of times.
This example gets an image at 10 seconds into a movie clip, myMovie.mp4:
MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL fileURLWithPath:@"myMovie.mp4"]];
UIImage *singleFrameImage = [movie thumbnailImageAtTime:10.0
                                             timeOption:MPMovieTimeOptionExact];
Note that this call is synchronous - i.e. the user will be forced to wait while the thumbnail is generated.
The other option is to get a series of images from a movie, from an array of times:
MPMoviePlayerController *movie = [[MPMoviePlayerController alloc]
    initWithContentURL:[NSURL fileURLWithPath:@"myMovie.mp4"]];
NSNumber *time1 = [NSNumber numberWithInt:10];
NSNumber *time2 = [NSNumber numberWithInt:11];
NSNumber *time3 = [NSNumber numberWithInt:12];
NSArray *times = [NSArray arrayWithObjects:time1, time2, time3, nil];
[movie requestThumbnailImagesAtTimes:times timeOption:MPMovieTimeOptionExact];
This second way runs asynchronously and posts a notification of type MPMoviePlayerThumbnailImageRequestDidFinishNotification each time a new image is generated. You can set up an observer to monitor this notification and pull each image out of its userInfo dictionary.
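As a sketch of that last step (relying on the documented userInfo keys MPMoviePlayerThumbnailImageKey and MPMoviePlayerThumbnailTimeKey; the observer method name thumbnailReady: is just an example), it might look something like:

```objectivec
// Register for the notification before requesting the thumbnails.
[[NSNotificationCenter defaultCenter]
    addObserver:self
       selector:@selector(thumbnailReady:)
           name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
         object:movie];

// Elsewhere in the same class - called once per requested time.
- (void)thumbnailReady:(NSNotification *)notification
{
    NSDictionary *userInfo = [notification userInfo];
    UIImage *image = [userInfo objectForKey:MPMoviePlayerThumbnailImageKey];
    NSNumber *time = [userInfo objectForKey:MPMoviePlayerThumbnailTimeKey];
    if (image) {
        NSLog(@"Got thumbnail for time %@", time);
        // ...store or display the image here...
    }
}
```

Remember to remove the observer (removeObserver:) when you're done, or you'll keep receiving notifications.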