I am trying to play a video in a loop on an AVSampleBufferDisplayLayer. I can get it to play through once with no problem, but when I try to loop it, it doesn't keep playing.
Per the answer to AVFoundation to reproduce a video loop, there isn't a way to rewind the AVAssetReader, so I re-create it. (I did see the answer to Looping a video with AVFoundation AVPlayer?, but AVPlayer is more full-featured. I am reading from a file, but I still want the AVSampleBufferDisplayLayer.)
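One alternative I've come across but haven't gotten working: iOS 8 added supportsRandomAccess and resetForReadingTimeRanges: on AVAssetReaderOutput, which in theory lets you rewind the existing output instead of re-creating the reader. A rough, untested sketch of that idea:

// Untested sketch: rewind the existing output rather than re-creating the reader.
// supportsRandomAccess must be set to YES before calling -startReading.
outVideo.supportsRandomAccess = YES;

// Later, once copyNextSampleBuffer has returned NULL at the end of the file:
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
[outVideo resetForReadingTimeRanges:@[ [NSValue valueWithCMTimeRange:fullRange] ]];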
One hypothesis is that I need to strip some of the H.264 headers, but I have no idea whether that would help (or how). Another is that it has something to do with the CMTimebase, but I've tried several things to no avail.
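For the timebase hypothesis, the kind of reset I tried when the reader runs dry (also visible commented out in the code below) is roughly:

// Attempted reset before the second pass; this did not fix the looping.
CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
CMTimebaseSetRate(_videoLayer.controlTimebase, 1.0);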
Code below, based on Apple's WWDC talk on Direct Access to Video Encoding:
- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *filepath = [[NSBundle mainBundle] pathForResource:@"sample-mp4" ofType:@"mp4"];
    NSURL *fileURL = [NSURL fileURLWithPath:filepath];
    AVAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];

    UIView *view = self.view;
    self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    self.videoLayer.bounds = view.bounds;
    self.videoLayer.position = CGPointMake(CGRectGetMidX(view.bounds), CGRectGetMidY(view.bounds));
    self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];

    // Drive the layer from a host-time-based control timebase.
    CMTimebaseRef controlTimebase;
    CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);
    self.videoLayer.controlTimebase = controlTimebase;
    CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
    CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);

    [[view layer] addSublayer:_videoLayer];

    dispatch_queue_t assetQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0); // ??? right queue?

    __block AVAssetReader *assetReaderVideo = [self createAssetReader:asset];
    __block AVAssetReaderTrackOutput *outVideo = [assetReaderVideo outputs][0];

    if ([assetReaderVideo startReading]) {
        [_videoLayer requestMediaDataWhenReadyOnQueue:assetQueue usingBlock:^{
            while ([_videoLayer isReadyForMoreMediaData]) {
                CMSampleBufferRef sampleVideo;
                if (([assetReaderVideo status] == AVAssetReaderStatusReading) && (sampleVideo = [outVideo copyNextSampleBuffer])) {
                    [_videoLayer enqueueSampleBuffer:sampleVideo];
                    CFRelease(sampleVideo);
                    CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                }
                else {
                    // End of file: re-create the reader and start over.
                    [_videoLayer stopRequestingMediaData];
                    //CMTimebaseSetTime(_videoLayer.controlTimebase, CMTimeMake(5, 1));
                    //CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);
                    //CMTimeShow(CMTimebaseGetTime(_videoLayer.controlTimebase));
                    assetReaderVideo = [self createAssetReader:asset];
                    outVideo = [assetReaderVideo outputs][0];
                    [assetReaderVideo startReading];
                    //sampleVideo = [outVideo copyNextSampleBuffer];
                    //[_videoLayer enqueueSampleBuffer:sampleVideo];
                }
            }
        }];
    }
}
- (AVAssetReader *)createAssetReader:(AVAsset *)asset {
    NSError *error = nil;
    AVAssetReader *assetReaderVideo = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTracks[0] outputSettings:nil];
    [assetReaderVideo addOutput:outVideo];
    return assetReaderVideo;
}
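One more idea I've been toying with, in case the problem is that the second pass's buffers carry presentation timestamps that are already in the past relative to the advancing timebase: re-stamping each buffer with an offset before enqueueing it. This is a hypothetical sketch, not something the code above does (the offset would need to grow by one asset duration per completed loop):

// Hypothetical: shift a buffer's timestamps forward so they stay ahead of
// the advancing timebase on the second and later passes.
CMTime offset = asset.duration; // would accumulate one duration per loop
CMSampleTimingInfo timing;
CMSampleBufferGetSampleTimingInfo(sampleVideo, 0, &timing);
timing.presentationTimeStamp = CMTimeAdd(timing.presentationTimeStamp, offset);
if (CMTIME_IS_VALID(timing.decodeTimeStamp)) {
    timing.decodeTimeStamp = CMTimeAdd(timing.decodeTimeStamp, offset);
}
CMSampleBufferRef shifted = NULL;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleVideo, 1, &timing, &shifted);
[_videoLayer enqueueSampleBuffer:shifted];
CFRelease(shifted);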
Thanks so much.