
I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.

I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.

My video is an m3u8 file located on the internet; it plays normally in an AVPlayerLayer without any problems.

What I have tried:

  1. AVAssetImageGenerator. It is not working: copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
  2. Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck: I still get a UI snapshot with a blank black area where the video is shown (a sketch of this attempt follows the list).
  3. AVPlayerItemVideoOutput. I have added a video output for my AVPlayerItem, but whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
  4. AVAssetReader. I was thinking of trying it, but decided not to lose time after finding a related question here.
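
For reference, here is roughly what attempt 2 looks like (a minimal sketch; `playerContainerView` is a placeholder for whatever view hosts the AVPlayerLayer):

import UIKit

// Sketch of attempt 2: snapshot the view hierarchy that contains the AVPlayerLayer.
// The player area comes back as a plain black rectangle.
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}

// let frameImage = snapshot(of: playerContainerView)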

So is there really no way to get a snapshot of something that I am already seeing on the screen right now? I can't believe it.


2 Answers


AVAssetImageGenerator is the best way to snapshot a video; this method asynchronously returns a UIImage:

import AVFoundation

// ...

var player: AVPlayer? // set elsewhere, playing the m3u8 item

func screenshot(handler: @escaping (UIImage) -> Void) {
    guard let player = player,
        let asset = player.currentItem?.asset else {
            return
    }

    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]

    // The completion handler is called once per requested time, on an arbitrary queue.
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let img = image {
            handler(UIImage(cgImage: img))
        }
    }
}

(It's Swift 4.2)
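
A possible call site, with the player already paused on the frame you want. `imageView` is just a placeholder here, and the generator's completion handler can fire off the main queue, so hop back to the main queue before touching UI:

screenshot { image in
    DispatchQueue.main.async {
        imageView.image = image // placeholder: whatever view displays the captured frame
    }
}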

  • @Axel Guilmin This captures only the AVPlayer. What if I want to take a screenshot of both the AVPlayer and a UIView? – Raghuram Feb 07 '17 at 06:56
  • I don't think my answer would be the right approach to capture a UIView. I did not test it, but this answer seems better: http://stackoverflow.com/a/4334902/1327557 – Axel Guilmin Feb 07 '17 at 09:19
  • @Axel Guilmin Thank you for your reply. See, this is my problem: stackoverflow.com/questions/42085479/… – Raghuram Feb 07 '17 at 09:35
  • @Anessence were you able to find an answer that captures from livestreams? – mfaani Jun 14 '19 at 01:38

AVPlayerItemVideoOutput works fine for me with an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime: and simply call copyPixelBufferForItemTime:itemTimeForDisplay:? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to do that conversion (a minimal sketch also follows the code below).

This answer is mostly cribbed from here.

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
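    // Note: add the output before playback starts; attaching it only at snapshot time did not work for the asker (see the comments below).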
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}

- (IBAction)grabFrame {
    // copyPixelBufferForItemTime: follows the "copy" rule, so release the buffer when done.
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    if (buffer) {
        CVBufferRelease(buffer);
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];


    NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded)
        {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        }
        else
        {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end
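
For completeness, a minimal sketch (in Swift, untested against this exact stream) of the CVPixelBuffer-to-UIImage conversion mentioned above, going through Core Image:

import UIKit
import CoreImage
import CoreVideo

// Sketch only: converts a CVPixelBuffer from the video output into a UIImage.
// In real code, create the CIContext once and reuse it.
func image(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}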
  • I will try it now and let you know. A short question: should I set up the `AVPlayerItemVideoOutput` object right from the beginning? My code plays the video without the output added, and then, whenever I need a snapshot, I quickly create an `AVPlayerItemVideoOutput` object, add it to the player item, and try to read the pixel buffer. I also tried adding the output a bit earlier, whenever my snapshot gesture had started receiving touches but was not yet recognized. Is this important? – frangulyan Sep 18 '16 at 11:27
  • I think you must set up the `AVPlayerItemVideoOutput` from the beginning, probably before you start playback. – Rhythmic Fistman Sep 18 '16 at 12:24
  • Thanks for your solution, I just checked and it works! The trick was to add the `AVPlayerItemVideoOutput` before I start to play, as you said. It seems a bit inefficient to keep a video output attached the whole time just for one screenshot somewhere in the future, which in most cases will not even be taken, but at least it works! – frangulyan Sep 18 '16 at 13:40
  • You're welcome. I guess you're right - attaching an `ARGB` `AVPlayerItemVideoOutput` to what may very well be a `YUV` flow could be expensive. I'd never thought of that. – Rhythmic Fistman Sep 18 '16 at 13:45
  • Did this solution work for you with FairPlay protected HLS? I've tried copyPixelBufferForItemTime and it works great with unprotected streams, but once you use FairPlay it returns NULL. http://stackoverflow.com/questions/42839831/fairplay-streaming-calling-copypixelbufferforitemtime-on-avplayeritemvideooutpu – schmittsfn Apr 04 '17 at 14:50