21

I have implemented an AVPlayer and I want to take an image or thumbnail when a toolbar button is clicked, then open it in a new UIViewController with a UIImageView. The image should be scaled exactly like the AVPlayer. The segue is already working; I just have to get the image at the current play time.

Thanks!

kchromik

8 Answers

55

Objective-C

AVAsset *asset = [AVAsset assetWithURL:sourceURL];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
CMTime time = CMTimeMake(1, 1);
CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);  // CGImageRef won't be released by ARC

Swift

let asset = AVAsset(URL: sourceURL)
let imageGenerator = AVAssetImageGenerator(asset: asset)
let time = CMTimeMake(1, 1)
let imageRef = try! imageGenerator.copyCGImageAtTime(time, actualTime: nil)
let thumbnail = UIImage(CGImage: imageRef)
// No CGImageRelease needed here: Swift manages the returned CGImage automatically

Swift 3.0

let sourceURL = URL(string: "Your Asset URL")
let asset = AVAsset(url: sourceURL!)
let imageGenerator = AVAssetImageGenerator(asset: asset)
let time = CMTimeMake(1, 1)
let imageRef = try! imageGenerator.copyCGImage(at: time, actualTime: nil)
let thumbnail = UIImage(cgImage: imageRef)

Note: adapt the Swift code to your Swift version.
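
The question asks for the frame at the current play time rather than a fixed one-second mark. A minimal sketch of that variation, assuming an `AVPlayer` is available; the `thumbnailOfCurrentFrame(of:)` helper name is illustrative, not part of the original answer:

import AVFoundation
import UIKit

func thumbnailOfCurrentFrame(of player: AVPlayer) -> UIImage? {
    guard let asset = player.currentItem?.asset else { return nil }

    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true   // respect the video's orientation
    // Ask for the exact frame rather than the nearest keyframe
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let time = player.currentTime()                   // the current play time
    guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else { return nil }
    return UIImage(cgImage: cgImage)
}

To keep the thumbnail framed like the player, display it in a `UIImageView` whose `contentMode` matches the player's video gravity (e.g. `.scaleAspectFit` for `.resizeAspect`), or set `generator.maximumSize` to the player view's size.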

Dipen Panchasara
  • Taking the screenshot itself works with both methods, but the screenshot is scaled to fullscreen and not to the same format as my video. Is there a possibility to adapt the size of the 'thumbnail'? – kchromik Mar 12 '13 at 13:16
  • Gives me an AVErrorOperationNotSupportedForAsset error code. I'm trying to load an m3u8 server URL. Please help – ruyamonis346 Aug 02 '13 at 07:58
  • To load an m3u8 file you need `AVPlayer`, which is capable of playing .m3u8 streaming files. Try playing it with `AVPlayer`. – Dipen Panchasara Aug 03 '13 at 12:14
  • Add: #import <AVFoundation/AVFoundation.h> – evya Nov 07 '13 at 15:11
  • This works great, but how do you change the time? I have been playing around with the time variable and CMTime but can't figure out the relationship, or how to take a screenshot, say, 5 or 10 seconds in. – iqueqiorio Apr 11 '15 at 04:15
  • @DipenPanchasara I want to take a thumbnail using m3u8, but it's not working; I'm getting nil all the time!! – IamDev Dec 23 '16 at 07:15
  • m3u8 is a playlist file which takes some time to start playback; you can capture a screenshot once `AVPlayer` is in the `readyToPlay` state (see the sketch after these comments). You need to add an observer; refer to this [SO post - knowing-when-avplayer-object-is-ready-to-play](http://stackoverflow.com/questions/5401437/knowing-when-avplayer-object-is-ready-to-play). You can't jump directly to a specific time in the player while playing a streaming file. I hope it helps. – Dipen Panchasara Dec 23 '16 at 07:57
  • @DipenPanchasara Does it work for a high-resolution image URL? – Bucket Jan 03 '17 at 15:10
  • I am getting a vertically inverted image. Any idea why? – Cristian Pena May 22 '17 at 14:48
  • @DipenPanchasara I'm getting an NSOSStatusErrorDomain Code=-12792 error. Any ideas? Thanks in advance. – Sayooj Sep 20 '19 at 10:18
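
A rough sketch of the `readyToPlay` observation Dipen describes above, using block-based KVO; the `statusObservation` property and the completion handler are illustrative assumptions, not part of the original comment:

import AVFoundation

var statusObservation: NSKeyValueObservation?

func runWhenReadyToPlay(on player: AVPlayer, _ handler: @escaping (AVPlayerItem) -> Void) {
    guard let item = player.currentItem else { return }
    // Fire the handler once the item reports .readyToPlay
    statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
        guard item.status == .readyToPlay else { return }
        handler(item)   // now it is reasonable to attempt a thumbnail or a seek
    }
}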
3

Try this

- (UIImage *)takeScreenShot {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidURL options:nil];

    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;

    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60); // the time at which you want the screenshot (here 1/60 s in)

    CGImageRef imgRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:&err];
    UIImage *screenshot = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef); // the copied CGImageRef is not released by ARC

    return screenshot;
}

Hope this helps !!!

rahulg
arun.s
1

Swift 2.x:

let asset = AVAsset(...)
let imageGenerator = AVAssetImageGenerator(asset: asset)
let screenshotTime = CMTime(seconds: 1, preferredTimescale: 1)
if let imageRef = try? imageGenerator.copyCGImageAtTime(screenshotTime, actualTime: nil) {

    let image = UIImage(CGImage: imageRef)

    // do something with your image
}
JAL
0

Add the code below to generate a thumbnail from a video.

AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:partOneUrl options:nil];

AVAssetImageGenerator *assetGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
assetGenerator.appliesPreferredTrackTransform = YES;

NSError *err = NULL;
CMTime time = CMTimeMake(1, 2); // half a second into the video

CGImageRef imgRef = [assetGenerator copyCGImageAtTime:time actualTime:NULL error:&err];
UIImage *one = [[UIImage alloc] initWithCGImage:imgRef];
CGImageRelease(imgRef); // the copied CGImageRef is not released by ARC
FelixSFD
0

This is how I get a shot of the currently visible frame on the scene in Swift:

The key is to

  1. get the current time of the player, which is of type CMTime
  2. convert that time into seconds of type Float64
  3. convert the seconds back to a CMTime using CMTimeMake; the first parameter, where the seconds go, must be cast to Int64

Code:

var myImage: UIImage?

guard let player = player else { return }

let currentTime: CMTime = player.currentTime() // step 1.
let currentTimeInSecs: Float64 = CMTimeGetSeconds(currentTime) // step 2.
let actionTime: CMTime = CMTimeMake(Int64(currentTimeInSecs), 1) // step 3.

let asset = AVAsset(url: fileUrl)
let imageGenerator = AVAssetImageGenerator(asset: asset)
imageGenerator.appliesPreferredTrackTransform = true // prevent image rotation

do {
    let imageRef = try imageGenerator.copyCGImage(at: actionTime, actualTime: nil)
    myImage = UIImage(cgImage: imageRef)
} catch let err as NSError {
    print(err.localizedDescription)
}
Lance Samaria
0

A Swift extension on `AVPlayer` for generating thumbnails from a video:

extension AVPlayer {
    func generateThumbnail(time: CMTime) -> UIImage? {
        guard let asset = currentItem?.asset else { return nil }
        let imageGenerator = AVAssetImageGenerator(asset: asset)

        do {
            let cgImage = try imageGenerator.copyCGImage(at: time, actualTime: nil)
            return UIImage(cgImage: cgImage)
        } catch {
            print(error.localizedDescription)
        }

        return nil
    }
}
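
For example, assuming a configured `AVPlayer` named `player` and a `UIImageView` named `imageView` (both illustrative), the extension could be used like this:

// Grab a thumbnail at the player's current position
if let thumbnail = player.generateThumbnail(time: player.currentTime()) {
    imageView.image = thumbnail
}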
0

When you need to create multiple thumbnails at once, the class AVAssetImageGenerator is golden, as it provides an asynchronous way.

If you need a thumbnail image of the player's current frame, simply render its view (platform specific) or its layer (platform independent):

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGSize frameSize = _playerLayer.frame.size;
CGContextRef thumbnailContext = CGBitmapContextCreate(nil, frameSize.width, frameSize.height, 8, 0, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);
[_playerLayer renderInContext:thumbnailContext];
CGImageRef playerThumbnail = CGBitmapContextCreateImage(thumbnailContext);
CGContextRelease(thumbnailContext);

This is super fast and works synchronously.
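
For reference, a rough Swift sketch of the same layer-rendering idea, assuming an `AVPlayerLayer` is at hand (the `snapshot(of:)` helper name is illustrative):

import AVFoundation
import UIKit

func snapshot(of playerLayer: AVPlayerLayer) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: playerLayer.bounds.size)
    return renderer.image { context in
        // Render the player layer's tree into the bitmap context
        playerLayer.render(in: context.cgContext)
    }
}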

Karsten
0

Code for 2022:

seconds = .. the normal human-meaning desired time position in the video

guard let pl = .. your player ..
guard let ite = pl.currentItem ..

let testGen = AVAssetImageGenerator(asset: ite.asset)
testGen.maximumSize = CGSize(width: 0, height: .. height of your preview box)

testGen.requestedTimeToleranceBefore = .zero // during development
// or something like ... CMTime(value: .. your tolerance .., timescale: 600)

testGen.requestedTimeToleranceAfter = .zero // during development
// ditto

if #available(tvOS 16, *) {
    Task { [weak self] ..
        do {
            let ct = CMTime(value: CMTimeValue(seconds), timescale: 1)
            // NOTE THE "1"

            let (foundImage, foundTime) = try await testGen.image(at: ct)
            
            let foundAsSecs = CMTimeGetSeconds(foundTime)
            print("tried gen at \(seconds) found as \(foundAsSecs) \n")
            
            self. .. your preview .image = UIImage(cgImage: foundImage)
            
        } catch {
            print("gen err \(error)")
        }
    }
}

Setting the two tolerances is a nuanced issue; search around for the details.

Watch out for the gotcha that a timescale of 1 is needed for the CMTime here.
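
As an aside on that last point, a small illustration (with arbitrary values) of why the timescale matters when building the CMTime:

import CoreMedia

// With a timescale of 1, the value is interpreted as whole seconds:
let coarse = CMTime(value: 2, timescale: 1)                    // exactly 2 s
// With another timescale, the value is divided by it: 150/60 = 2.5 s
let finer = CMTime(value: 150, timescale: 60)
// CMTime(seconds:preferredTimescale:) does that conversion for you:
let convenient = CMTime(seconds: 2.5, preferredTimescale: 600)
print(CMTimeGetSeconds(coarse), CMTimeGetSeconds(finer), CMTimeGetSeconds(convenient)) // 2.0 2.5 2.5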

Fattie