13

I have an H.264 stream in Annex B format and followed this link here to implement H.264 decoding with the iOS 8 VideoToolbox.

I checked the OSStatus at every step:

  1. Use CMVideoFormatDescriptionCreateFromH264ParameterSets with the SPS and PPS data to create a CMFormatDescription. (status == noErr)

  2. Create a VTDecompressionSession using VTDecompressionSessionCreate. (status == noErr)

  3. Capture the NAL unit payload into a CMBlockBuffer, making sure to replace the start code with
    a byte-length code. (status == noErr)

  4. Create a CMSampleBuffer. (status == noErr)

  5. Use VTDecompressionSessionDecodeFrame; the callback function reports error code -8969 (simulator) or -12909 (device).

I suspect I did something wrong in step 3; I am not entirely sure what the length code means. Following the WWDC session video, I replace every NAL unit start code 00 00 00 01 with 00 00 80 00. Is that right, or should I check something else? Thanks.

Alex Cheng

1 Answer

17

Finally got it working. Here are the details of how to use VideoToolbox to decode H.264 stream data:

  1. Get SPS & PPS NALUs from H.264 stream data (or SDP)
  2. Create CMFormatDescription by using CMVideoFormatDescriptionCreateFromH264ParameterSets.
  3. Create VTDecompressionSession by using VTDecompressionSessionCreate.
  4. Get NALUnit payload into a CMBlockBuffer.
  5. Replace the start code with a 4-byte length code. (Note: length = NAL unit length - start code length.)
  6. Create a CMSampleBuffer by using CMSampleBufferCreate.
  7. Use VTDecompressionSessionDecodeFrame and get the result from callback.

Then you have to use dispatch_semaphore_t to coordinate frame decoding and display. I uploaded the sample project to my Git repository; I hope it helps someone else.

DrMickeyLauer
Alex Cheng
  • 2
    Thanks! Looking at your code made me realize that I was not byte-swapping the NALU length code. It has to be in big-endian format. – 12on Dec 12 '14 at 16:25
  • 2
    The link is broken. Do you happen to have the VideoToolboxDemo uploaded somewhere else. This is exactly what I am trying to do and I am struggling. – ddelnano Jan 16 '15 at 22:20
  • Can someone provide some code for steps 4 and 5 above? I'm stuck on it and can't find a solution... – FormigaNinja Mar 19 '15 at 03:25
  • Try something like this, sorry for bad formatting:

        int offset = _spsSize + _ppsSize;
        long blockLength = totalFrameSize - offset;
        uint8_t *data = malloc(blockLength);
        memcpy(data, &frame.data[offset], blockLength);
        uint32_t dataLength32 = htonl(blockLength - 4);
        memcpy(data, &dataLength32, sizeof(uint32_t));
        OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, data, blockLength, kCFAllocatorNull, NULL, 0, blockLength, 0, &blockBuffer);
        NSLog(@"BlockBufferCreation: \t %@", (status == kCMBlockBufferNoErr) ? @"successful!" : @"failed...");

    – Olivia Stork Apr 07 '15 at 15:24
  • 1
    @LivyStork what type of object is "frame"? It seems more code is needed to interpret your comment and more information would be greatly appreciated. – 3rdLion Apr 08 '15 at 01:14
  • 1
    @3rdLion Well, you only asked about steps 4 and 5 :) The data frame should be an entire NALU that you received from a raw H.264 stream. Whether it's a P or I frame depends on whether SPS and PPS parameters are involved. Sorry for not going into more specifics, I can only give so much info in a Stack Overflow comment. I'll reply tomorrow and try to help out more. – Olivia Stork Apr 08 '15 at 03:15
  • 1
    @LivyStork a complete example would be super helpful. The issue is that this new iOS8 decoding isn't documented at a level for practical application. Getting rather desperate in my efforts to make it work. Maybe throw something up on Github? or contact me directly :p – 3rdLion Apr 08 '15 at 03:36
  • 1
    @3rdLion I agree there aren't many complete examples online. It took me a few weeks to learn everything ground up, so maybe I can help by posting an example on stackoverflow. I'll try to get that done today and will comment here again when I'm done. – Olivia Stork Apr 08 '15 at 16:15
  • @3rdLion I have posted a pretty long example [here](http://stackoverflow.com/questions/29525000/how-to-use-videotoolbox-to-decompress-h-264-video-stream/). Feel free to ask questions there. I didn't cover how to fetch the raw H.264 video data since that varies wildly depending on the situation, but I covered all the other steps. – Olivia Stork Apr 08 '15 at 20:46