
I'm trying to get two videos to play sequentially. I've tried AVQueuePlayer, but there's a huge "burp" between the two clips. I need them to play without interruption.
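For context, the AVQueuePlayer attempt was just the stock pattern of queueing one player item per clip, roughly like this (the URLs stand in for my real assets):

// Queue the clips back to back; they do play in order, but with a noticeable pause at the seam.
AVPlayerItem * item1 = [AVPlayerItem playerItemWithURL:url1];
AVPlayerItem * item2 = [AVPlayerItem playerItemWithURL:url2];
AVQueuePlayer * queuePlayer = [AVQueuePlayer queuePlayerWithItems:
                                  [NSArray arrayWithObjects:item1, item2, nil]];
[queuePlayer play];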

So I'm trying to use AVMutableComposition and an AVPlayer but can't get it right.

Here's my code (ignore memory leaks, just testing in an empty project..):

composition = [[AVMutableComposition alloc] init];

NSString * path = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"mp4"];
NSURL * url = [NSURL fileURLWithPath:path];
AVURLAsset * asset = [[AVURLAsset alloc] initWithURL:url options:nil];

NSError * error = NULL;
[composition insertTimeRange:CMTimeRangeMake(CMTimeMake(0,1000),CMTimeMake(4,1000)) ofAsset:asset atTime:CMTimeMake(0,1000) error:&error];
if(error) NSLog(@"error: %@",error);

path = [[NSBundle mainBundle] pathForResource:@"chug1" ofType:@"mp4"];
url = [NSURL fileURLWithPath:path];
asset = [[AVURLAsset alloc] initWithURL:url options:nil];

error = NULL;
[composition insertTimeRange:CMTimeRangeMake(CMTimeMake(0,1000),CMTimeMake(3,1000)) ofAsset:asset atTime:CMTimeMake(4.1,1000) error:&error];
if(error) NSLog(@"error: %@",error);

AVPlayerItem * item = [[AVPlayerItem alloc] initWithAsset:composition];
AVPlayer * player = [AVPlayer playerWithPlayerItem:item];
AVPlayerLayer * layer = [AVPlayerLayer playerLayerWithPlayer:player];

[layer setFrame:CGRectMake(0, 0, 320, 480)];
[[[self view] layer] addSublayer:layer];
[player play];

The code seems right to me. The first frame of each video is actually rendered to the screen, but the video doesn't play at all. Am I missing something? Do I need to figure out how to use the MutableTrack stuff?

asked by gngrwzrd (edited by GEOCHET)
  • Hey @gngrwzrd, I'm also getting this problem, but I don't have much experience and I'm new to iPhone development. I want to play multiple videos from the network smoothly, without any gap between the clips, and I also need to seek them properly. Can you help me get through this issue? – Omer Waqas Khan Apr 27 '12 at 07:40
  • I ended up not using video for gapless playback; the best solution is using PNG sequences with separate audio files. – gngrwzrd May 06 '12 at 17:27
  • Oh ok, and what about the seeking issue, did you get that solved? I mean seeking a video that is being streamed, without downloading it. – Omer Waqas Khan May 06 '12 at 17:48
  • No, for our project the assets were either local on the device or downloaded and saved locally before playing them. And since we used PNG sequences, seeking isn't an issue anymore. Check out "Ozgood" in the App Store. Not sure if that's exactly what you're looking for, but you can pick apart an app like "Talking Tom" to see how it's done: in iTunes, copy the app to your desktop, show package contents, and start snooping around; you'll see how they do it. – gngrwzrd May 06 '12 at 18:15
  • Thanks bro :) ... so right now there is no solution for streaming multiple files smoothly, without a gap, in an iPhone app. – Omer Waqas Khan May 06 '12 at 18:47
  • Yep. But even playing local video files on the device won't guarantee gapless playback or smooth seeking. The overhead required to decode video guarantees a minimum amount of _noticeable_ latency. Add to that trying to do it over a slow 3G connection, and some of your users will have a less-than-optimal experience. – gngrwzrd May 06 '12 at 19:04
  • I think the time ranges are way too short. `CMTimeMake(4, 1000)` means 0.004 seconds. The right thing to do is to use the asset's duration, as djromero's answer below demonstrates. – Randall Cook Sep 25 '13 at 22:07
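To put numbers on that last comment: CMTimeMake(value, timescale) represents value / timescale seconds, so the durations in the question are fractions of a millisecond, and the 4.1 passed as the first argument is silently truncated to an integer. A quick illustration:

CMTime tooShort  = CMTimeMake(4, 1000);    // 4 / 1000 = 0.004 s -- far shorter than any real clip
CMTime fourSecs  = CMTimeMake(4000, 1000); // 4000 / 1000 = 4.0 s
CMTime truncated = CMTimeMake(4.1, 1000);  // value is an int64_t, so 4.1 silently becomes 4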

1 Answer


Maybe you're using the wrong time insertion points and durations; both depend on the actual video assets. I'd write something like this:

CMTime insertionPoint = kCMTimeZero;
NSError * error = nil;
composition = [AVMutableComposition composition];

// Insert the whole of asset #1 at the start of the composition.
asset = /* obtain asset #1 */
if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                          ofAsset:asset
                           atTime:insertionPoint
                            error:&error])
{
    NSLog(@"error: %@", error);
}

// The next clip starts exactly where the previous one ends.
insertionPoint = CMTimeAdd(insertionPoint, asset.duration);

// Insert the whole of asset #2 right after asset #1.
asset = /* obtain asset #2 */
if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                          ofAsset:asset
                           atTime:insertionPoint
                            error:&error])
{
    NSLog(@"error: %@", error);
}
...
/* playback stuff */
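For completeness, the "playback stuff" can stay essentially as in the question, just driven by the finished composition; a minimal sketch (frame size taken from the question's code):

// The composition behaves like a single continuous asset when handed to the player.
AVPlayerItem * item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer * player = [AVPlayer playerWithPlayerItem:item];
AVPlayerLayer * layer = [AVPlayerLayer playerLayerWithPlayer:player];
[layer setFrame:CGRectMake(0, 0, 320, 480)];
[[[self view] layer] addSublayer:layer];
[player play];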
answered by djromero