
I'm using Apple's Video Toolbox framework to compress raw frames captured by the device's camera.

My callback is called with a CMSampleBufferRef object that contains a CMBlockBuffer.

The CMBlockBuffer object contains the H264 elementary stream, but I haven't found any way to get a pointer to it.

When I printed the CMSampleBufferRef object to the console I got:

(lldb) po blockBufferRef
CMBlockBuffer 0x1701193e0 totalDataLength: 4264 retainCount: 1 allocator: 0x1957c2c80 subBlockCapacity: 2
[0] 4264 bytes @ offset 128 Buffer Reference:
CMBlockBuffer 0x170119350 totalDataLength: 4632 retainCount: 1 allocator: 0x1957c2c80 subBlockCapacity: 2
[0] 4632 bytes @ offset 0 Memory Block 0x10295c000, 4632 bytes (custom V=0 A=0x0 F=0x18498bb44 R=0x0)

It seems that the CMBlockBuffer object I managed to get a pointer to contains another CMBlockBufferRef (4632 bytes), which is not accessible.

Can anyone explain how to access the H264 elementary stream?

Thank you!

– koby (edited by Hamid Yusifli)

2 Answers


I've been struggling with this myself for quite some time now, and have finally figured everything out.

The function CMBlockBufferGetDataPointer gives you access to all the data you need, but there are a few not very obvious things you need to do to convert it to an elementary stream.

AVCC vs Annex B format

The data in the CMBlockBuffer is stored in AVCC format, while elementary streams typically follow the Annex B specification (here is an excellent overview of the two formats). In AVCC format, the first 4 bytes contain the length of the NAL unit (another word for an H264 packet). You need to replace this header with the 4-byte start code 0x00 0x00 0x00 0x01, which serves as a separator between NAL units in an Annex B elementary stream (the 3-byte version 0x00 0x00 0x01 works fine too).
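To make the header swap concrete, here is a minimal standalone C sketch (the helper name is made up, and no Apple APIs are involved) that overwrites the 4-byte AVCC length prefix of a single NAL unit with the Annex B start code:

```c
#include <stdint.h>
#include <string.h>

// Overwrite the 4-byte AVCC length prefix at the front of `nalUnit`
// with the Annex B start code 0x00 0x00 0x00 0x01, in place.
// Assumes the buffer holds exactly one NAL unit.
static void avccHeaderToStartCode(uint8_t *nalUnit) {
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    memcpy(nalUnit, startCode, sizeof(startCode));
}
```

This only works as-is when the buffer holds a single NAL unit; the next section covers the multi-NALU case.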

Multiple NAL units in a single CMBlockBuffer

The next not very obvious thing is that a single CMBlockBuffer will sometimes contain multiple NAL units. Apple seems to add an additional NAL unit (an SEI) containing metadata to every I-frame NAL unit (also called an IDR). This is probably why you are seeing multiple buffers in a single CMBlockBuffer object. However, the CMBlockBufferGetDataPointer function gives you a single pointer with access to all the data. That said, the presence of multiple NAL units complicates the conversion of the AVCC headers: now you actually have to read the length value in each AVCC header to find the next NAL unit, and keep converting headers until you reach the end of the buffer.
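That walk can be sketched in standalone C (hypothetical helper name, no Apple APIs): read each big-endian length byte by byte, rewrite the header in place, and hop to the next NAL unit until the buffer is exhausted.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

// Walk an AVCC buffer, overwrite every 4-byte length header with the
// Annex B start code in place, and return the number of NAL units
// converted. Reading the length byte by byte keeps it byte-order safe.
static size_t convertAvccToAnnexB(uint8_t *buf, size_t len) {
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    size_t count = 0;
    size_t offset = 0;
    while (offset + 4 <= len) {
        uint32_t nalLen = ((uint32_t)buf[offset]     << 24) |
                          ((uint32_t)buf[offset + 1] << 16) |
                          ((uint32_t)buf[offset + 2] << 8)  |
                           (uint32_t)buf[offset + 3];
        memcpy(buf + offset, startCode, sizeof(startCode));
        if (nalLen == 0) break;  // a zero-length NAL unit would never advance
        offset += 4 + nalLen;
        count++;
    }
    return count;
}
```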

Big-Endian vs Little-Endian

The next not very obvious thing is that the AVCC header is stored in big-endian format, while iOS is natively little-endian. So when you read the length value from an AVCC header, pass it through the CFSwapInt32BigToHost function first.
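Outside of Core Foundation the same conversion can be written by hand. This standalone C sketch (hypothetical helper name) mirrors the memcpy-plus-CFSwapInt32BigToHost pattern used in the code example further down, detecting the host's byte order at run time:

```c
#include <stdint.h>
#include <string.h>

// Read a 4-byte big-endian AVCC length header into a host-order
// uint32_t. Equivalent to memcpy + CFSwapInt32BigToHost, but with the
// swap spelled out so it runs on any platform.
static uint32_t avccLengthToHost(const uint8_t *header) {
    uint32_t value;
    memcpy(&value, header, sizeof(value));
    const uint32_t one = 1;
    if (*(const uint8_t *)&one == 1) {  // little-endian host (e.g. iOS)
        value = ((value & 0x000000FFu) << 24) |
                ((value & 0x0000FF00u) << 8)  |
                ((value & 0x00FF0000u) >> 8)  |
                ((value & 0xFF000000u) >> 24);
    }
    return value;
}
```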

SPS and PPS NAL units

The final not very obvious thing is that the data inside the CMBlockBuffer does not contain the parameter NAL units, SPS and PPS, which contain configuration parameters for the decoder such as profile, level, resolution and frame rate. These are stored as metadata in the sample buffer's format description and can be accessed via the function CMVideoFormatDescriptionGetH264ParameterSetAtIndex. Note that you have to add the start codes to these NAL units before sending them. The SPS and PPS NAL units do not have to be sent with every new frame; a decoder only needs to read them once, but it is common to resend them periodically, for example before every new I-frame NAL unit.

Code Example

Below is a code example taking all of these things into account.

static void videoFrameFinishedEncoding(void *outputCallbackRefCon,
                                       void *sourceFrameRefCon,
                                       OSStatus status,
                                       VTEncodeInfoFlags infoFlags,
                                       CMSampleBufferRef sampleBuffer) {
    // Check if there were any errors encoding
    if (status != noErr) {
        NSLog(@"Error encoding video, err=%lld", (int64_t)status);
        return;
    }

    // In this example we will use a NSMutableData object to store the
    // elementary stream.
    NSMutableData *elementaryStream = [NSMutableData data];


    // Find out if the sample buffer contains an I-Frame.
    // If so we will write the SPS and PPS NAL units to the elementary stream.
    BOOL isIFrame = NO;
    CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, 0);
    if (CFArrayGetCount(attachmentsArray)) {
        CFBooleanRef notSync;
        CFDictionaryRef dict = CFArrayGetValueAtIndex(attachmentsArray, 0);
        BOOL keyExists = CFDictionaryGetValueIfPresent(dict,
                                                       kCMSampleAttachmentKey_NotSync,
                                                       (const void **)&notSync);
        // An I-Frame is a sync frame
        isIFrame = !keyExists || !CFBooleanGetValue(notSync);
    }

    // This is the start code that we will write to
    // the elementary stream before every NAL unit
    static const size_t startCodeLength = 4;
    static const uint8_t startCode[] = {0x00, 0x00, 0x00, 0x01};

    // Write the SPS and PPS NAL units to the elementary stream before every I-Frame
    if (isIFrame) {
        CMFormatDescriptionRef description = CMSampleBufferGetFormatDescription(sampleBuffer);

        // Find out how many parameter sets there are
        size_t numberOfParameterSets;
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description,
                                                           0, NULL, NULL,
                                                           &numberOfParameterSets,
                                                           NULL);

        // Write each parameter set to the elementary stream
        for (size_t i = 0; i < numberOfParameterSets; i++) {
            const uint8_t *parameterSetPointer;
            size_t parameterSetLength;
            CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description,
                                                               i,
                                                               &parameterSetPointer,
                                                               &parameterSetLength,
                                                               NULL, NULL);

            // Write the parameter set to the elementary stream
            [elementaryStream appendBytes:startCode length:startCodeLength];
            [elementaryStream appendBytes:parameterSetPointer length:parameterSetLength];
        }
    }

    // Get a pointer to the raw AVCC NAL unit data in the sample buffer
    size_t blockBufferLength;
    uint8_t *bufferDataPointer = NULL;
    CMBlockBufferGetDataPointer(CMSampleBufferGetDataBuffer(sampleBuffer),
                                0,
                                NULL,
                                &blockBufferLength,
                                (char **)&bufferDataPointer);

    // Loop through all the NAL units in the block buffer
    // and write them to the elementary stream with
    // start codes instead of AVCC length headers
    size_t bufferOffset = 0;
    static const int AVCCHeaderLength = 4;
    while (bufferOffset + AVCCHeaderLength < blockBufferLength) {
        // Read the NAL unit length
        uint32_t NALUnitLength = 0;
        memcpy(&NALUnitLength, bufferDataPointer + bufferOffset, AVCCHeaderLength);
        // Convert the length value from Big-endian to Little-endian
        NALUnitLength = CFSwapInt32BigToHost(NALUnitLength);
        // Write start code to the elementary stream
        [elementaryStream appendBytes:startCode length:startCodeLength];
        // Write the NAL unit without the AVCC length header to the elementary stream
        [elementaryStream appendBytes:bufferDataPointer + bufferOffset + AVCCHeaderLength
                               length:NALUnitLength];
        // Move to the next NAL unit in the block buffer
        bufferOffset += AVCCHeaderLength + NALUnitLength;
    }
}   
Anton Holmberg
  • Hi Anton, you are correct! I figured it out myself a day after posting my question. My main problem was understanding that the NALU size is stored as big-endian, but after taking a look at the contents of the memory buffer I figured it out and was able to parse and copy the ES into my buffers. One thing I must add is the AUD. In some cases we will multiplex the ES together with an audio stream into MPEG-TS, or any other container that requires AUDs. I added an AUD before every NALU and my decoder is now able to decode the stream. – koby Feb 14 '15 at 06:59
  • Quick question: You mentioned that NSMutableData was not a good way to hold ES data. Why is that, and what would you recommend using instead? – dcheng Jul 27 '15 at 18:53
  • The bad thing about NSMutableData is that it keeps allocating space on the fly as bytes are appended, which is unnecessary work. It can be avoided by preallocating space for the data using [NSMutableData dataWithCapacity:]. Another option is to keep a buffer as an instance variable that you reuse every time videoFrameFinishedEncoding is called. But all of these suggestions are minor optimizations that probably won't affect the performance much at all. – Anton Holmberg Jul 28 '15 at 19:47
  • Why Apple choose to separate one frame into multiple NALUs in CMBlockBuffer? Can I just merge these NALUs into one NALU? – ideawu Mar 14 '16 at 06:34
  • Hmm... I'm currently trying this, and Video Toolbox is giving length values in little-endian now. Is anyone else getting this? BTW I'm running it on an iPhone X. – Xavier L. Feb 11 '20 at 12:11
  • can I simply write this `elementaryStream` to a file? will it be playable? – prabhu Apr 02 '20 at 13:00

Thanks Anton for an excellent answer! Here is a naive Swift port of your solution for people who want to use the concepts discussed here directly in their Swift-based projects.

public func didEncodeFrame(frame: CMSampleBuffer)
{
    print ("Received encoded frame in delegate...")

    //----AVCC to Elem stream-----//
    var elementaryStream = NSMutableData()

    //1. check if CMBuffer had I-frame
    var isIFrame:Bool = false
    let attachmentsArray:CFArray = CMSampleBufferGetSampleAttachmentsArray(frame, false)!
    //check how many attachments
    if ( CFArrayGetCount(attachmentsArray) > 0 ) {
        let dict = CFArrayGetValueAtIndex(attachmentsArray, 0)
        let dictRef:CFDictionaryRef = unsafeBitCast(dict, CFDictionaryRef.self)
        //get value: kCMSampleAttachmentKey_NotSync is only present (and true) on non-sync frames
        let value = CFDictionaryGetValue(dictRef, unsafeBitCast(kCMSampleAttachmentKey_NotSync, UnsafePointer<Void>.self))
        //an I-frame is a sync frame, so the NotSync key must be absent or false
        if ( value == nil || !CFBooleanGetValue(unsafeBitCast(value, CFBoolean.self)) ){
            print ("IFrame found...")
            isIFrame = true
        }
    }

    //2. define the start code
    let nStartCodeLength:size_t = 4
    let nStartCode:[UInt8] = [0x00, 0x00, 0x00, 0x01]

    //3. write the SPS and PPS before I-frame
    if ( isIFrame == true ){
        let description:CMFormatDescriptionRef = CMSampleBufferGetFormatDescription(frame)!
        //how many params
        var numParams:size_t = 0
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description, 0, nil, nil, &numParams, nil)

        //write each param-set to elementary stream
        print("Write param to elementaryStream ", numParams)
        for i in 0..<numParams {
            var parameterSetPointer:UnsafePointer<UInt8> = nil
            var parameterSetLength:size_t = 0
            CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description, i, &parameterSetPointer, &parameterSetLength, nil, nil)
            elementaryStream.appendBytes(nStartCode, length: nStartCodeLength)
            elementaryStream.appendBytes(parameterSetPointer, length: unsafeBitCast(parameterSetLength, Int.self))
        }
    }

    //4. Get a pointer to the raw AVCC NAL unit data in the sample buffer
    var blockBufferLength:size_t = 0
    var bufferDataPointer: UnsafeMutablePointer<Int8> = nil
    CMBlockBufferGetDataPointer(CMSampleBufferGetDataBuffer(frame)!, 0, nil, &blockBufferLength, &bufferDataPointer)
    print ("Block length = ", blockBufferLength)

    //5. Loop through all the NAL units in the block buffer
    var bufferOffset:size_t = 0
    let AVCCHeaderLength:Int = 4
    while (bufferOffset < (blockBufferLength - AVCCHeaderLength) ) {
        // Read the NAL unit length
        var NALUnitLength:UInt32 =  0
        memcpy(&NALUnitLength, bufferDataPointer + bufferOffset, AVCCHeaderLength)
        //Big-endian to host byte order
        NALUnitLength = CFSwapInt32BigToHost(NALUnitLength)
        if ( NALUnitLength > 0 ){
            print ( "NALUnitLen = ", NALUnitLength)
            // Write start code to the elementary stream
            elementaryStream.appendBytes(nStartCode, length: nStartCodeLength)
            // Write the NAL unit without the AVCC length header to the elementary stream
            elementaryStream.appendBytes(bufferDataPointer + bufferOffset + AVCCHeaderLength, length: Int(NALUnitLength))
            // Move to the next NAL unit in the block buffer
            bufferOffset += AVCCHeaderLength + size_t(NALUnitLength)
            print("Moving to next NALU...")
        } else {
            // A zero-length NAL unit would never advance bufferOffset; stop to avoid an infinite loop
            break
        }
    }
    print("Read completed...")
}
Shay
  • @Shay Can you post the decoding part as well? – Sreejith S Jul 11 '17 at 10:27
  • I am using this code in Swift 3 and not getting an I-frame: let value = CFDictionaryGetValue(dictRef, unsafeBitCast(kCMSampleAttachmentKey_NotSync, to: UnsafeRawPointer.self)). Can you please help me out with this? Thanks in advance. – Ashish Aug 04 '17 at 12:35
  • For other archaeologists :) : CFDictionaryRef is called CFDictionary these days, and UnsafePointer is now UnsafeRawPointer. You will also need to add a to: label to the second argument of the unsafeBitCast calls. – sinewave440hz Aug 24 '21 at 16:33