
I am live-streaming H264 video over the network from another device and rendering the frames with AVSampleBufferDisplayLayer.

Some more context: I've timed how long the source takes to emit one full H264 frame, and it's 14ms on average (roughly 70fps).

Since I want to render these frames in 'real time', I do NOT set up a custom timebase, nor do I provide CMSampleTimingInfo for each CMSampleBuffer. Instead I set the kCMSampleAttachmentKey_DisplayImmediately attachment to true:

// Mark the buffer so the layer displays it as soon as it is decoded,
// ignoring any presentation timestamp.
let attachments: CFArray? = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, createIfNecessary: true)
if let attachmentArray = attachments {
    let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachmentArray, 0), to: CFMutableDictionary.self)

    CFDictionarySetValue(dict,
                         Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                         Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
}
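
For completeness, the control-timebase approach I'm deliberately skipping would look roughly like this (a minimal sketch inside my view controller; it drives the layer from the host clock so frames display by timestamp instead of immediately):

// Sketch of the control-timebase setup I am NOT using.
var timebase: CMTimebase?
if CMTimebaseCreateWithMasterClock(allocator: kCFAllocatorDefault,
                                   masterClock: CMClockGetHostTimeClock(),
                                   timebaseOut: &timebase) == noErr,
   let timebase = timebase {
    CMTimebaseSetTime(timebase, time: .zero) // start the clock at t = 0
    CMTimebaseSetRate(timebase, rate: 1.0)   // run in real time
    self.VideoLayer.controlTimebase = timebase
}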

...and then I enqueue the frame into AVSampleBufferDisplayLayer like so:

if self.VideoLayer.isReadyForMoreMediaData {
    self.VideoLayer.enqueue(sampleBuffer)
    self.VideoLayer.setNeedsDisplay() // I've toggled this line on and off
} else {
    print("not ready for more media data")
}

This code works really well and displays the video frames as soon as possible, and I never hit the self.VideoLayer.isReadyForMoreMediaData == false case.
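
For what it's worth, the API also offers a pull model where the layer asks for data instead of me pushing it; I could feed frames that way instead (a sketch; nextFrame() is a hypothetical helper that pops the next CMSampleBuffer off my network receive queue):

// Pull model: the layer invokes this block whenever it can accept more data.
self.VideoLayer.requestMediaDataWhenReady(on: .main) {
    // nextFrame() is hypothetical: returns the next buffered CMSampleBuffer,
    // or nil when nothing is pending.
    while self.VideoLayer.isReadyForMoreMediaData,
          let sampleBuffer = self.nextFrame() {
        self.VideoLayer.enqueue(sampleBuffer)
    }
}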

However, under certain conditions (reproducible every time), AVSampleBufferDisplayLayer hangs.

Some more detail on the repro: if I start the server application that transmits the H264 video and then 'background' it, the server runs a little slowly and transmits the TCP frames slowly. When I then run my Swift app, this effectively gives AVSampleBufferDisplayLayer 'warm-up time': it receives the stream at a much lower frame rate (something like 5-6fps).

If I then wait a few seconds and bring the server application into the foreground, it starts transmitting frames at full speed (14ms per frame, 60+fps). AVSampleBufferDisplayLayer now handles the faster input without issue and keeps running fine.

But if I do NOT background the server application at the start and keep it in the foreground, so that it transmits at full rate from the moment my Swift app connects, AVSampleBufferDisplayLayer renders one or two frames and then completely freezes.

I have tried reading the following AVSampleBufferDisplayLayer values to check whether something is wrong, but none of them shows any problem (the snippet after this list shows roughly how I check them). I have looked at:

AVSampleBufferDisplayLayer.error == nil
AVSampleBufferDisplayLayer.isReadyForMoreMediaData == true
AVSampleBufferDisplayLayer.requiresFlushToResumeDecoding == false
AVSampleBufferDisplayLayer.status == AVQueuedSampleBufferRenderingStatus.rendering
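
Concretely, the check I run each time I enqueue a frame looks roughly like this:

// Sanity dump of the layer state; everything prints as "healthy" even
// while the picture is visibly frozen.
func dumpLayerState(_ layer: AVSampleBufferDisplayLayer) {
    print("error:", layer.error?.localizedDescription ?? "nil")
    print("isReadyForMoreMediaData:", layer.isReadyForMoreMediaData)
    print("requiresFlushToResumeDecoding:", layer.requiresFlushToResumeDecoding)
    print("status == .rendering:", layer.status == .rendering)
}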

They all show the expected values, as if everything were rendering correctly. VTDecompressionSessionDecodeFrame is also still receiving new frames and decoding them successfully (although I'm not using the CVPixelBuffer it produces; I just enqueue the CMSampleBuffer directly).

My theory is that the queue inside AVSampleBufferDisplayLayer needs to build up in size at the beginning: if I feed it frames too fast right away, some code somewhere errors out, but if I give it time to warm up and increase the feed rate slowly, it's able to build up its internal queue and then process and render the frames.

However, the interesting part is that AVSampleBufferDisplayLayer.isReadyForMoreMediaData is always true, so the queue never seems to fill up.
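
One experiment that would test this warm-up theory: instead of enqueueing frames the instant they arrive over TCP, buffer them and drain on a timer whose rate ramps from ~5fps up to full speed (a sketch inside my view controller; pending is a hypothetical buffer my network code appends to):

var pending: [CMSampleBuffer] = [] // hypothetical: filled by my network code
var interval: TimeInterval = 0.2   // start around 5fps

let timer = DispatchSource.makeTimerSource(queue: .main)
timer.schedule(deadline: .now(), repeating: interval)
timer.setEventHandler {
    guard !pending.isEmpty, self.VideoLayer.isReadyForMoreMediaData else { return }
    self.VideoLayer.enqueue(pending.removeFirst())
    if interval > 1.0 / 60.0 {
        // Shrink the tick interval ~10% per frame until we reach ~60fps.
        interval = max(1.0 / 60.0, interval * 0.9)
        timer.schedule(deadline: .now() + interval, repeating: interval)
    }
}
timer.resume()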

The other thing I could try is to re-create the entire AVSampleBufferDisplayLayer whenever I detect that it has frozen, but I cannot find any property that tells me whether it's currently frozen.
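
The closest thing I've found to a detection hook is the failed-to-decode notification, which fires on hard decoder errors (though in my freeze case it never does). Something like this (decodeObserver and rebuildLayer() are hypothetical members of my view controller):

// Observe hard decode failures; keep the token so it can be removed later.
self.decodeObserver = NotificationCenter.default.addObserver(
    forName: .AVSampleBufferDisplayLayerFailedToDecode,
    object: self.VideoLayer,
    queue: .main
) { note in
    let error = note.userInfo?[AVSampleBufferDisplayLayerFailedToDecodeNotificationErrorKey]
    print("decode failed:", error ?? "unknown")
    self.rebuildLayer() // hypothetical: tear down and recreate the layer
}

Absent a real signal, the bluntest fallback would be calling VideoLayer.flush() (or flushAndRemoveImage()) on a watchdog timer and re-feeding from the next keyframe, but without a way to confirm the freeze that's just guesswork.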

My questions are:

  1. Why does the AVSampleBufferDisplayLayer freeze / how can I fix it?
  2. If it's not an easy fix, how can I at least detect the freeze?