
I'm trying to extract frames as UIImages from a video in Swift. I found several Objective-C solutions, but I'm having trouble finding anything in Swift. Assuming the following is correct, can someone either help me convert it to Swift or give me their own take on how to do this?

Source: Grabbing the first frame of a video from UIImagePickerController?

- (UIImage *)imageFromVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);

    AVAssetImageGenerator *assetIG = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetIG.appliesPreferredTrackTransform = YES;
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *igError = nil;
    thumbnailImageRef = [assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60)
                                        actualTime:NULL
                                             error:&igError];

    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", igError);

    UIImage *image = thumbnailImageRef
        ? [[UIImage alloc] initWithCGImage:thumbnailImageRef]
        : nil;

    return image;
}
Lightsout

4 Answers


Here it is converted to Swift, and it does work.

import AVFoundation
import UIKit

func imageFromVideo(url: URL, at time: TimeInterval) -> UIImage? {
    let asset = AVURLAsset(url: url)

    let assetIG = AVAssetImageGenerator(asset: asset)
    assetIG.appliesPreferredTrackTransform = true
    // .encodedPixels replaces the old AVAssetImageGeneratorApertureModeEncodedPixels constant.
    assetIG.apertureMode = .encodedPixels

    let cmTime = CMTime(seconds: time, preferredTimescale: 60)
    let thumbnailImageRef: CGImage
    do {
        thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
    } catch {
        print("Error: \(error)")
        return nil
    }

    return UIImage(cgImage: thumbnailImageRef)
}

But remember that this function is synchronous, so it's better not to call it on the main queue.

You can either do this:

DispatchQueue.global(qos: .background).async {
    let image = self.imageFromVideo(url: url, at: 0)

    DispatchQueue.main.async {
        self.imageView.image = image
    }
}

Or use generateCGImagesAsynchronously instead of copyCGImage.
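
Here's a minimal sketch of that asynchronous route, reusing the same url and imageView as in the snippet above:

let assetIG = AVAssetImageGenerator(asset: AVURLAsset(url: url))
assetIG.appliesPreferredTrackTransform = true

// Request a single frame at t = 0; pass more NSValue-wrapped CMTimes to grab several frames at once.
let times = [NSValue(time: CMTime(seconds: 0, preferredTimescale: 60))]
assetIG.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, error in
    guard result == .succeeded, let cgImage = cgImage else {
        print("Error: \(String(describing: error))")
        return
    }
    // The handler may be invoked off the main queue, so hop back to main for UI work.
    DispatchQueue.main.async {
        self.imageView.image = UIImage(cgImage: cgImage)
    }
}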

Dmitry
  • @Dmitry I'm using this, but the image I get is brighter than the video. Can anyone help? – Zღk Nov 12 '19 at 09:30

Here's a Swift 5 alternative to Dmitry's solution, so you don't have to worry about which queue you're on:

public func imageFromVideo(url: URL, at time: TimeInterval, completion: @escaping (UIImage?) -> Void) {
    DispatchQueue.global(qos: .background).async {
        let asset = AVURLAsset(url: url)

        let assetIG = AVAssetImageGenerator(asset: asset)
        assetIG.appliesPreferredTrackTransform = true
        assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

        let cmTime = CMTime(seconds: time, preferredTimescale: 60)
        let thumbnailImageRef: CGImage
        do {
            thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
        } catch {
            print("Error: \(error)")
            DispatchQueue.main.async {
                completion(nil)
            }
            return
        }

        DispatchQueue.main.async {
            completion(UIImage(cgImage: thumbnailImageRef))
        }
    }
}

Here's how to use it:

imageFromVideo(url: videoUrl, at: 0) { image in
   // Do something with the image here
}
Nicolai Harbo

Here's an async/await version of @Dmitry's answer for those who don't like completion handlers:

func imageFromVideo(url: URL, at time: TimeInterval) async throws -> UIImage {
    try await withCheckedThrowingContinuation { continuation in
        DispatchQueue.global(qos: .background).async {
            let asset = AVURLAsset(url: url)

            let assetIG = AVAssetImageGenerator(asset: asset)
            assetIG.appliesPreferredTrackTransform = true
            assetIG.apertureMode = AVAssetImageGenerator.ApertureMode.encodedPixels

            let cmTime = CMTime(seconds: time, preferredTimescale: 60)
            let thumbnailImageRef: CGImage
            do {
                thumbnailImageRef = try assetIG.copyCGImage(at: cmTime, actualTime: nil)
            } catch {
                continuation.resume(throwing: error)
                return
            }
            continuation.resume(returning: UIImage(cgImage: thumbnailImageRef))
        }
    }
}

Usage:

let vidUrl = <#your url#>
do {
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
} catch {
    // handle error
}

Or like this if you're in an async throwing function:

func someThrowingFunc() async throws {
    let vidUrl = <#your url#>
    let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
    // do something with image
}
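
And if you need to call it from a synchronous context, a minimal sketch is to wrap the call in a Task (the imageView here is just a placeholder for wherever the frame ends up):

Task {
    do {
        let firstFrame = try await imageFromVideo(url: vidUrl, at: 0)
        imageView.image = firstFrame
    } catch {
        // handle error
        print("Error: \(error)")
    }
}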
mrpaw69

You can do this easily on iOS. Below is a Swift snippet showing how.

let url = Bundle.main.url(forResource: "video_name", withExtension: "mp4")
let videoAsset = AVAsset(url: url!)

// Times (in seconds) at which to grab frames.
let t1 = CMTime(value: 1, timescale: 1)
let t2 = CMTime(value: 4, timescale: 1)
let t3 = CMTime(value: 8, timescale: 1)
let timesArray = [
    NSValue(time: t1),
    NSValue(time: t2),
    NSValue(time: t3)
]

let generator = AVAssetImageGenerator(asset: videoAsset)
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero

generator.generateCGImagesAsynchronously(forTimes: timesArray) { requestedTime, image, actualTime, result, error in
    guard result == .succeeded, let image = image else {
        print("Error: \(String(describing: error))")
        return
    }
    let img = UIImage(cgImage: image)
    // Do something with img; note that this handler may not be called on the main queue.
}

You can find the demo code here and the Medium article here.

Ruchira Randana