
I want to share the knowledge I worked out over the past few days, since there isn't much to be found about this.

I am still fiddling with the sound. Comments and tips are welcome. ;-)

Karsten
  • Hey Karsten! My team and I have been working on a pretty basic AVSampleBufferDisplayLayer + AVAssetReader setup for the past few days. We've made great progress but are stuck on some issues. We'd love help if possible (paid if needed). Any chance I can contact you somehow? – Roi Mulia Jun 24 '19 at 22:22

2 Answers


Here are my code snippets. First, declare the layer:

#import <AVFoundation/AVFoundation.h>

@property (nonatomic, retain) AVSampleBufferDisplayLayer *videoLayer;

Then set up the video layer:

self.videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
self.videoLayer.bounds = self.bounds;
self.videoLayer.position = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
self.videoLayer.videoGravity = AVLayerVideoGravityResizeAspect;
self.videoLayer.backgroundColor = [[UIColor greenColor] CGColor];

// Set up a control timebase so the layer knows when to display each frame
CMTimebaseRef controlTimebase;
CMTimebaseCreateWithMasterClock( CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase );

self.videoLayer.controlTimebase = controlTimebase;
// Start the timebase at 5 seconds into the video and play at normal rate
CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(5, 1));
CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);

// connecting the video layer with the view

[[self layer] addSublayer:_videoLayer];
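
A useful side effect of the control timebase: its rate and time act as the transport controls, because the layer displays a frame when the frame's presentation timestamp matches the timebase time. A minimal sketch:

// Pause, jump, and resume playback purely through the control timebase
CMTimebaseSetRate(self.videoLayer.controlTimebase, 0.0);              // pause
CMTimebaseSetTime(self.videoLayer.controlTimebase, CMTimeMake(0, 1)); // jump back to 0 s
CMTimebaseSetRate(self.videoLayer.controlTimebase, 1.0);              // resume at normal speed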

Finally, provide the video data to the layer:

// Create an asset reader for the video track (`asset` is your AVAsset)
NSError *error = nil;
AVAssetReader *assetReaderVideo = [AVAssetReader assetReaderWithAsset:asset error:&error];
AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *dic = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *outVideo = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:video outputSettings:dic];
[assetReaderVideo addOutput:outVideo];

if( [assetReaderVideo startReading] )
{
    // assetQueue is a serial dispatch queue you create for this work
    [_videoLayer requestMediaDataWhenReadyOnQueue: assetQueue usingBlock: ^{
        while( [_videoLayer isReadyForMoreMediaData] )
        {
            // copyNextSampleBuffer returns NULL once the track is exhausted
            CMSampleBufferRef sampleVideo = [outVideo copyNextSampleBuffer];
            if( sampleVideo == NULL )
            {
                [_videoLayer stopRequestingMediaData];
                break;
            }
            [_videoLayer enqueueSampleBuffer:sampleVideo];
            CFRelease(sampleVideo); // the "copy" call returns a +1 reference
        }
    }];
}

For further details, Session 513 from WWDC 2014 ("Direct Access to Video Encoding and Decoding") is very informative.

Karsten
  • This answer is missing a swath of code needed, as indicated by the next answer. As it is, this would not work for anyone. – James Bush Jul 30 '16 at 18:43
  • Not only is code missing, but it's wrong. There is no data property for CMSampleBuffer, so sampleBuffer.data makes no sense. – James Bush Aug 14 '16 at 21:53
  • The data is provided in the second snippet via outVideo, which is reading the video. You should watch WWDC session 513/2014 first... – Karsten Aug 15 '16 at 07:01
  • You need an AVAssetReader to get the track (or AVPlayerItem), and an AVAssetReader to read the video ([AVAssetReader startReading]). I can tell he didn't do that at all, and that his code doesn't work, because he's using two different variables for the asset reader: video and assetReaderVideo. That won't work under any circumstance. – James Bush Aug 15 '16 at 09:26

I am attempting this but finding that there is no image on the AVSampleBufferDisplayLayer.

I create the NAL units from a raw byte stream and pass the IDR and non-IDR slices using:

if ([avLayer isReadyForMoreMediaData]) {
    [avLayer enqueueSampleBuffer:sampleBuffer];
}
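
For reference, here is a minimal sketch of how one NAL unit can be wrapped into a CMSampleBuffer (the names are placeholders; it assumes formatDesc was already created from the stream's SPS/PPS via CMVideoFormatDescriptionCreateFromH264ParameterSets, and that naluBytes starts with a 4-byte AVCC length prefix rather than an Annex B start code):

static CMSampleBufferRef CreateSampleBufferFromNALU(CMVideoFormatDescriptionRef formatDesc,
                                                    void *naluBytes, size_t naluLength)
{
    // Wrap the raw bytes in a block buffer; kCFAllocatorNull means the
    // caller keeps ownership of the memory
    CMBlockBufferRef blockBuffer = NULL;
    CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, naluBytes, naluLength,
                                       kCFAllocatorNull, NULL, 0, naluLength, 0, &blockBuffer);

    CMSampleBufferRef sampleBuffer = NULL;
    const size_t sampleSizes[] = { naluLength };
    CMSampleBufferCreate(kCFAllocatorDefault, blockBuffer, true, NULL, NULL,
                         formatDesc, 1, 0, NULL, 1, sampleSizes, &sampleBuffer);
    CFRelease(blockBuffer);

    // Without a running control timebase, mark the frame for immediate display
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
    CFMutableDictionaryRef attachment = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(attachment, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

    return sampleBuffer;
}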

enqueueSampleBuffer: returns no error status, so it can be difficult to find out where things are going wrong.
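
The layer does expose failures, just not as a return value of enqueueSampleBuffer:. A small sketch, using the avLayer from above:

// Check the buffer itself before enqueueing it
if (!CMSampleBufferIsValid(sampleBuffer)) {
    NSLog(@"sample buffer is invalid");
}

// ... and ask the layer what went wrong afterwards
if (avLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
    NSLog(@"display layer failed: %@", avLayer.error);
    [avLayer flush]; // reset the layer before enqueueing further buffers
}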

Md1079
  • You'd better check out Session 513 from WWDC 2014 for the details. The problem is to provide the CMSampleBufferRef as expected. – Karsten Sep 26 '14 at 07:46
  • You can have a look at the source code I provided. [Take a look](http://stackoverflow.com/questions/25980070/how-to-use-avsamplebufferdisplaylayer-in-ios-8-for-rtp-h264-streams-with-gstream) – Zappel Oct 29 '14 at 18:47
  • I had some success with AVSampleBufferDisplayLayer, has anyone experienced any kind of jitter in playback (usually when an iFrame arrives)? I have experimented with buffers for smooth playback but not had too much success. I have also tried to use TimingInfo in the SampleBuffer to get it to play at the same point as the PresentationTimeStamp whilst setting 'ShouldDisplayImmediately' to false. – Md1079 Nov 07 '14 at 10:41
  • Maybe the frames in the stream are in different order than presentation order? – Lukasz Czerwinski Aug 22 '15 at 17:15
  • Jittery playback only occurs with multiple layers on single-core-processor devices, or when you use AVAssetReader to load the asset (it is not intended for real-time playback). If you load the asset with PHImageManager as either an AVPlayerItem or an AVAsset into an AVPlayer, and then redirect player output through AVPlayerItemVideoOutput, you can easily convert the pixel buffers to sample buffers (see the sketch after these comments). Run the method that performs these tasks on a separate thread from the main one, and you'll have real-time playback performance for up to 16 layers. I have working sample code for this; just ask. – James Bush Jul 31 '16 at 04:41
  • No error means you're not validating the sample buffer (CMSampleBufferIsValid). By the way, the source of your sample buffer is irrelevant; leave out those details unless creating sample buffers is the issue. It complicates the question unnecessarily, in that it sounds like you're asking for support for NAL units or whatever the heck. That's a separate question. There are 50 gazillion methods for treating sample buffers; sometimes, you need to use them all. – James Bush Jul 31 '16 at 04:43
  • This jittering is a common problem: you must realign the audio and video by delaying the display of some video frames. – Karsten Aug 01 '16 at 09:41
  • Error messages are not generated inside a block by default; use a try-catch statement to catch any exceptions. – James Bush Aug 14 '16 at 21:55
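
Regarding the AVPlayerItemVideoOutput route mentioned in the comments, here is a minimal sketch of the pixel-buffer-to-sample-buffer conversion (output and itemTime are placeholders for your video output and the requested frame time):

CVPixelBufferRef pixelBuffer = [output copyPixelBufferForItemTime:itemTime
                                               itemTimeForDisplay:NULL];
if (pixelBuffer != NULL) {
    // Describe the pixel buffer so Core Media can build a sample buffer around it
    CMVideoFormatDescriptionRef formatDesc = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc);

    // Only the presentation timestamp matters here; duration and decode time stay invalid
    CMSampleTimingInfo timing = { kCMTimeInvalid, itemTime, kCMTimeInvalid };
    CMSampleBufferRef sampleBuffer = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL,
                                       formatDesc, &timing, &sampleBuffer);

    [avLayer enqueueSampleBuffer:sampleBuffer];

    CFRelease(sampleBuffer);
    CFRelease(formatDesc);
    CVPixelBufferRelease(pixelBuffer);
}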