I used the following Stack Overflow posts to extract the H.264 parameters, so I can send an elementary stream over the network and have the receiving application reconstruct the CMSampleBuffer. However, no video image was displayed on the AVSampleBufferDisplayLayer.
I took a step back and extracted the NAL units and reconstructed the CMSampleBuffer internally on the device. I then discovered that I need to create a CMSampleTimingInfo struct and pass it as one of the arguments to CMSampleBufferCreate. When I enqueued the newly created CMSampleBuffer to an AVSampleBufferDisplayLayer, the video image was displayed.
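For reference, this is roughly what my working in-device reconstruction looks like (a sketch, not my exact code): `formatDesc` is assumed to have been built from the SPS/PPS with `CMVideoFormatDescriptionCreateFromH264ParameterSets`, and `blockBuffer`, `frameIndex`, `fps`, and `displayLayer` are assumed to exist already.

```swift
import CoreMedia
import AVFoundation

// Sketch: wrap one AVCC-framed access unit in a CMSampleBuffer with explicit
// timing, then enqueue it for display. Without the CMSampleTimingInfo array,
// the layer has no timestamps and nothing is rendered.
func enqueueFrame(blockBuffer: CMBlockBuffer,
                  formatDesc: CMVideoFormatDescription,
                  frameIndex: Int64,
                  fps: Int32,
                  displayLayer: AVSampleBufferDisplayLayer) {
    // One frame at 1/fps duration; DTS left invalid (no B-frames assumed).
    var timing = CMSampleTimingInfo(
        duration: CMTime(value: 1, timescale: fps),
        presentationTimeStamp: CMTime(value: frameIndex, timescale: fps),
        decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    let status = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: blockBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDesc,
        sampleCount: 1,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: 0,
        sampleSizeArray: nil,
        sampleBufferOut: &sampleBuffer)

    if status == noErr, let sb = sampleBuffer {
        displayLayer.enqueue(sb)
    }
}
```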
So I think I need to carry the CMSampleTimingInfo along with the elementary stream and send that over the network. However, I have watched WWDC 2014 - Session 513, and it says nothing about including timing information in the elementary stream.
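My current idea (which I would like confirmed or corrected) is that, since the elementary stream itself carries no container-level timing, I have to invent my own wire format that prefixes each NAL payload with its timestamp. A hypothetical sketch of such a framing, with a 12-byte big-endian header (8-byte PTS value plus 4-byte timescale, mirroring the CMTime fields), could look like this:

```swift
import Foundation

// Hypothetical wire format: each packet carries the presentation timestamp
// (CMTime value + timescale) ahead of the NAL payload, so the receiver can
// rebuild a CMSampleTimingInfo before calling CMSampleBufferCreate.
struct TimedPacket {
    let ptsValue: Int64
    let ptsTimescale: Int32
    let nalData: Data

    // Serialize: 8-byte value, 4-byte timescale (both big-endian), then payload.
    func serialized() -> Data {
        var out = Data()
        withUnsafeBytes(of: ptsValue.bigEndian) { out.append(contentsOf: $0) }
        withUnsafeBytes(of: ptsTimescale.bigEndian) { out.append(contentsOf: $0) }
        out.append(nalData)
        return out
    }

    // Parse: assemble the big-endian integers byte by byte (avoids any
    // alignment issues with loading from a Data slice), then slice the payload.
    static func parse(_ data: Data) -> TimedPacket? {
        guard data.count >= 12 else { return nil }
        let bytes = [UInt8](data)
        let value = bytes[0..<8].reduce(Int64(0)) { ($0 << 8) | Int64($1) }
        let scale = bytes[8..<12].reduce(Int32(0)) { ($0 << 8) | Int32($1) }
        return TimedPacket(ptsValue: value,
                           ptsTimescale: scale,
                           nalData: data.subdata(in: 12..<data.count))
    }
}
```

Is a custom framing like this the intended approach, or is there a standard way to convey timing for a raw H.264 elementary stream?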
Thank you in advance for responding to my question.