55

I am trying to take a screenshot of an AVPlayer embedded in a bigger view. I only need this for a testing framework, so private APIs or any other method are fine, because the framework will not be included in the App Store release.

I have tried:

  • UIGetScreenImage(): works well on the simulator but not on a device.
  • snapshotViewAfterScreenUpdates:: it shows the view, but I cannot create a UIImage from it.
  • drawViewHierarchyInRect:afterScreenUpdates: and renderInContext: will not work with AVPlayer (see the sketch after this list).
  • I don't want to use AVAssetImageGenerator to get an image from the video, because it is hard to get the right coordinates when the video player is a subview of other views.
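For reference, the drawViewHierarchy attempt looked roughly like the sketch below, where containerView stands in for the view that hosts the player; the video area comes back blank on a device:

import UIKit

// Sketch of the drawViewHierarchy attempt; `containerView` is a placeholder
// for the view that contains the AVPlayer's layer. On a device the video
// content renders blank.
func snapshot(of containerView: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(containerView.bounds.size, false, 0.0)
    defer { UIGraphicsEndImageContext() }
    containerView.drawHierarchy(in: containerView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}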
vodkhang

6 Answers

10

I know you don't want to use AVAssetImageGenerator, but I've also researched this extensively and I believe it is currently the only solution. It's not as difficult as you say to get the right frame, because you should be able to get the current time of your player. In my app the following code works perfectly:

-(UIImage *)getAVPlayerScreenshot 
{
    AVURLAsset *asset = (AVURLAsset *)self.playerItem.asset;
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
    imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
    CGImageRef thumb = [imageGenerator copyCGImageAtTime:self.playerItem.currentTime
                                              actualTime:NULL
                                                   error:NULL];
    UIImage *videoImage = [UIImage imageWithCGImage:thumb];
    CGImageRelease(thumb);
    return videoImage;
}
Bob de Graaf
  • The OP specified explicitly that they didn't want to use an AVAssetImageGenerator. – JAL Aug 22 '16 at 16:08
  • @JAL Yes, but it's simply the only way right now, and I gave a solution for the reason he didn't want to use it. So I believe this is still the best answer possible. – Bob de Graaf Aug 23 '16 at 11:36
  • @JAL The OP can specify what they want, and sometimes the answer is that you can't: you have to do it the way you don't want, or these are the benefits of doing it this way. Regardless of whether the OP enjoys it, others can benefit from the answer. – mfaani Jun 14 '19 at 01:42
1

AVPlayer renders video using the GPU, so you cannot capture it with Core Graphics methods.

However, it is possible to capture frames with AVAssetImageGenerator; you just need to specify a CMTime.


Update:

Forget about taking a screenshot of the entire screen. AVPlayerItemVideoOutput is my final choice; it supports video streams.

Here is my full implementation: https://github.com/BB9z/ZFPlayer/commit/a32c7244f630e69643336b65351463e00e712c7f#diff-2d23591c151edd3536066df7c18e59deR448
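A minimal Swift sketch of that approach (not taken from the linked commit; PlayerFrameGrabber and player are illustrative names):

import AVFoundation
import CoreImage
import CoreVideo
import UIKit

// Grabs the frame currently being displayed via AVPlayerItemVideoOutput.
// `player` is assumed to be the AVPlayer whose item you are rendering.
final class PlayerFrameGrabber {
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    private let player: AVPlayer
    private let ciContext = CIContext()

    init(player: AVPlayer) {
        self.player = player
        // The output must be attached to the item before frames can be copied.
        player.currentItem?.add(output)
    }

    // Returns the frame at the current playback time, or nil if no new frame is available yet.
    func currentFrame() -> UIImage? {
        let time = player.currentTime()
        guard output.hasNewPixelBuffer(forItemTime: time),
              let buffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else {
            return nil
        }
        let image = CIImage(cvPixelBuffer: buffer)
        guard let cgImage = ciContext.createCGImage(image, from: image.extent) else {
            return nil
        }
        return UIImage(cgImage: cgImage)
    }
}

The pixel-buffer route also works for streamed items, where AVAssetImageGenerator often cannot produce frames.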

BB9z
  • The OP specified explicitly that they didn't want to use an AVAssetImageGenerator. – JAL Apr 08 '16 at 03:52
  • Are there any documented mentions of why Core Graphics doesn't work with AVPlayer? – Varrry Jan 23 '18 at 16:13
  • Thank you very much for this answer. Works like a charm; all the other solutions don't work on a real device, especially when it's streaming... – Viktor Todorov Jul 13 '20 at 09:32
1

Swift version of Bob's answer above. I'm using an AVQueuePlayer, but it should work for a regular AVPlayer too.

public func getImageSnapshot() -> UIImage? {
    guard let asset = player.currentItem?.asset else { return nil }

    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.requestedTimeToleranceAfter = CMTime.zero
    imageGenerator.requestedTimeToleranceBefore = CMTime.zero

    do {
        let thumb = try imageGenerator.copyCGImage(at: player.currentTime(), actualTime: nil)
        return UIImage(cgImage: thumb)
    } catch {
        print("⛔️ Failed to get video snapshot: \(error)")
    }
    return nil
}
N S
0

Here is code for taking a screenshot of your entire screen, including the AVPlayer. You only need to add a UIImageView on top of your video player; it stays hidden, is shown only while we take the screenshot, and is hidden again afterwards.

func takeScreenshot() -> UIImage? {
    // 1 Hide all UI you do not want on the screenshot
    self.hideButtonsForScreenshot()

    // 2 Create a screenshot from your AVPlayer
    if let url = (self.overlayPlayer?.currentItem?.asset as? AVURLAsset)?.url {

        let asset = AVAsset(url: url)

        let imageGenerator = AVAssetImageGenerator(asset: asset)
        imageGenerator.requestedTimeToleranceAfter = CMTime.zero
        imageGenerator.requestedTimeToleranceBefore = CMTime.zero

        if let thumb: CGImage = try? imageGenerator.copyCGImage(at: self.overlayPlayer!.currentTime(), actualTime: nil) {
            let videoImage = UIImage(cgImage: thumb)
            // Note: create an image view on top of your video player with the exact same dimensions,
            // and display it before taking the screenshot (mine is created in the storyboard).
            // 3 Put the image from the screenshot in your screenshotPhotoView and unhide it
            self.screenshotPhotoView.image = videoImage
            self.screenshotPhotoView.isHidden = false
        }
    }

    // 4 Take the screenshot
    let bounds = UIScreen.main.bounds
    UIGraphicsBeginImageContextWithOptions(bounds.size, true, 0.0)
    self.view.drawHierarchy(in: bounds, afterScreenUpdates: true)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    // 5 Show all the UI again that you didn't want on your screenshot
    self.showButtonsForScreenshot()
    // 6 Now hide the screenshotPhotoView again
    self.screenshotPhotoView.isHidden = true
    self.screenshotPhotoView.image = nil
    return image
}
Mikkel Cortnum
-3

If you want to take a screenshot of the current screen, just call the following method from any action event; it gives you a UIImage object.

-(UIImage *) screenshot
{
    UIGraphicsBeginImageContext(self.view.bounds.size);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *sourceImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    //Now we position the image X/Y away from the top-left corner to get the portion we want
    UIGraphicsBeginImageContext(sourceImage.size);
    [sourceImage drawAtPoint:CGPointMake(0, 0)];
    UIImage *croppedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    //To save the image to the device's photo album:
    //UIImageWriteToSavedPhotosAlbum(croppedImage, nil, nil, nil);

    return croppedImage;
}

Hope this will help you.

Suresh.Chandgude
  • This is almost identical to [this answer](http://stackoverflow.com/a/23675253/2415822), and `renderInContext` will not work with `AVPlayer`. – JAL Sep 09 '16 at 15:55
  • Try this:

        - (UIImage *)imageFromVideoAtPath:(NSString *)path atTime:(NSTimeInterval)time {
            NSURL *videoURL = [NSURL fileURLWithPath:path];
            MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
            [moviePlayer prepareToPlay];
            UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:time timeOption:MPMovieTimeOptionNearestKeyFrame];
            [moviePlayer stop];
            return thumbnail;
        }

    – Suresh.Chandgude Sep 09 '16 at 16:38
-5
CGRect grabRect = CGRectMake(0, 0, 320, 568); // size you want to take a screenshot of

if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)]) {
    UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
} else {
    UIGraphicsBeginImageContext(grabRect.size);
}
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
[self.view.layer renderInContext:ctx];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
nutz