
I have an Android device that sends a raw live H264 Annex B NAL unit stream, like [0,0,0,1,103,...][0,0,0,1,104,...][0,0,0,1,101,...][0,0,0,1,65,...][0,0,0,1,65,...], and I am trying to mux it into an FLV container and send it to nginx with the RTMP module, using ffmpeg's libavformat.

If I save the received live stream to a local file, say test.h264, I can mux it to the server fine using the ffmpeg command "ffmpeg -i test.h264 -f flv rtmp://my/server/url". But I don't know how to handle the live stream.
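
To show where I am, here is my rough libavformat setup for the FLV/RTMP output (a sketch only, using the newer codecpar API; older builds set the same fields on st->codec, and error handling is omitted):

#include <string.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Sketch: open an FLV muxer over RTMP and add one H264 stream.
   avcc/avcc_size are the sequence-header bytes built from SPS/PPS. */
static AVFormatContext *open_flv_rtmp(const char *url, int width, int height,
                                      const uint8_t *avcc, int avcc_size)
{
    AVFormatContext *oc = NULL;
    AVStream *st;

    av_register_all();              /* not needed on ffmpeg 4.0+ */
    avformat_network_init();

    avformat_alloc_output_context2(&oc, NULL, "flv", url);
    st = avformat_new_stream(oc, NULL);
    st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codecpar->codec_id   = AV_CODEC_ID_H264;
    st->codecpar->width      = width;
    st->codecpar->height     = height;

    /* the muxer writes this as the FLV AVC sequence header */
    st->codecpar->extradata =
        av_mallocz(avcc_size + AV_INPUT_BUFFER_PADDING_SIZE);
    memcpy(st->codecpar->extradata, avcc, avcc_size);
    st->codecpar->extradata_size = avcc_size;

    avio_open(&oc->pb, url, AVIO_FLAG_WRITE);
    avformat_write_header(oc, NULL);
    return oc;   /* write frames, then av_write_trailer(oc) at the end */
}

The open question is how to turn the incoming NAL units into the AVPackets that av_interleaved_write_frame() expects.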

I noticed that ffmpeg/libavformat/avc.c has two functions that seem to achieve my goal, but I'm not sure.

Here is the code from ffmpeg:

/* Scans an Annex B buffer and writes each NAL unit to pb, prefixed with
   its 4-byte big-endian length (the framing FLV/AVCC expects).
   Returns the number of bytes written. */
int ff_avc_parse_nal_units(AVIOContext *pb, const uint8_t *buf_in, int size)
{
    const uint8_t *p = buf_in;
    const uint8_t *end = p + size;
    const uint8_t *nal_start, *nal_end;

    size = 0;
    nal_start = ff_avc_find_startcode(p, end);
    for (;;) {
        /* skip the zero bytes and the terminating 1 of the start code */
        while (nal_start < end && !*(nal_start++));
        if (nal_start == end)
            break;

        nal_end = ff_avc_find_startcode(nal_start, end);
        avio_wb32(pb, nal_end - nal_start);      /* 4-byte length prefix */
        avio_write(pb, nal_start, nal_end - nal_start);
        size += 4 + nal_end - nal_start;
        nal_start = nal_end;
    }
    return size;
}

/* Same conversion, but into a freshly allocated buffer: *buf is replaced
   with the converted data and *size is updated accordingly. */
int ff_avc_parse_nal_units_buf(const uint8_t *buf_in, uint8_t **buf, int *size)
{
    AVIOContext *pb;
    int ret = avio_open_dyn_buf(&pb);
    if (ret < 0)
        return ret;

    ff_avc_parse_nal_units(pb, buf_in, *size);

    av_freep(buf);                        /* free the caller's old buffer */
    *size = avio_close_dyn_buf(pb, buf);  /* hand back the converted one */
    return 0;
}
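
These ff_-prefixed functions are internal to libavformat and not callable from an application, so here is a sketch of the same Annex B to length-prefixed rewrite in user code (assumption: the caller supplies an output buffer large enough, worst case the input size plus one extra byte per NAL unit, since a 3-byte start code becomes a 4-byte length):

#include <stdint.h>
#include <string.h>

/* Find the next 00 00 01 start code at or after p; returns end if none. */
static const uint8_t *find_startcode(const uint8_t *p, const uint8_t *end)
{
    for (; p + 3 <= end; p++)
        if (p[0] == 0 && p[1] == 0 && p[2] == 1)
            return p;
    return end;
}

/* Rewrite Annex B start codes as 4-byte big-endian lengths and return
   the number of bytes written to out. */
static int annexb_to_avcc(const uint8_t *in, int in_size, uint8_t *out)
{
    const uint8_t *end = in + in_size;
    const uint8_t *nal = find_startcode(in, end);
    int out_size = 0;

    while (nal < end) {
        nal += 3;                                 /* skip the 00 00 01 */
        const uint8_t *next = find_startcode(nal, end);
        const uint8_t *nal_end = next;
        while (nal_end > nal && nal_end[-1] == 0) /* drop the leading zero
                                                     of a 4-byte start code */
            nal_end--;
        int len = nal_end - nal;
        out[out_size++] = (uint8_t)(len >> 24);   /* 4-byte length prefix */
        out[out_size++] = (uint8_t)(len >> 16);
        out[out_size++] = (uint8_t)(len >> 8);
        out[out_size++] = (uint8_t)len;
        memcpy(out + out_size, nal, len);
        out_size += len;
        nal = next;
    }
    return out_size;
}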

Any useful reply is appreciated.

Thank you!

  • I assume you understand that one NALU is not the same thing as one frame, and that you have already worked out the extradata/sequence header? And that you must convert from Annex B. – szatmary May 18 '16 at 16:32
  • all of that is done if you invoke the H264 AVParser: https://ffmpeg.org/doxygen/trunk/group__lavc__parsing.html – Ronald S. Bultje May 18 '16 at 17:23
  • @szatmary I have noticed someone asking somewhere how to get extradata from an H264 NAL unit, but I do not know why to do that or what comes next after this step. – Zhou Yufeng May 19 '16 at 02:18
  • @RonaldS.Bultje I searched the AVParser APIs. It seems they are used for decoding H264. Can AVParser process NAL units and output an AVPacket structure? – Zhou Yufeng May 19 '16 at 02:31
  • @szatmary Forgive me as a newbie in H264. I have read your answer about Annex B and AVCC ["h264 annexb bitstream to flv mux ffmpeg library"](http://stackoverflow.com/questions/29751805/h264-annexb-bitstream-to-flv-mux-ffmpeg-library) again and now know that I should grab the SPS and PPS NAL units, convert them to AVCC extradata, and pass that to the codec context's extradata. Your answer is very helpful! But a question still exists: how do I handle the IDR and non-IDR NAL units (type 0x65 or 0x41)? My final target is to mux those NAL units into FLV too. – Zhou Yufeng May 19 '16 at 08:38 (see the extradata sketch after these comments)
  • The output of AVParser is a uint8_t *data plus an int size field, which together can be used to initialize an AVPacket. See e.g. this code: https://ffmpeg.org/doxygen/trunk/libavformat_2utils_8c-source.html#l01289 – Ronald S. Bultje May 19 '16 at 10:56
  • @RonaldS.Bultje I read the example code (av_parser_parse2) and now know that the data and size can be used to init an AVPacket. But what about the other fields of AVPacket, like pts and dts? – Zhou Yufeng May 19 '16 at 14:51
  • Annex B has no timestamps, so you'll have to make them up somehow. You could use the VUI parameters in the SPS to read the framerate, but those are not always present. – Ronald S. Bultje May 19 '16 at 15:29
  • @RonaldS.Bultje The framerate is set by myself, so I think I should set fields like pts and dts manually. – Zhou Yufeng May 19 '16 at 15:33
  • Right, if you know the framerate, you know the timestamps, and thus setting pts/dts manually should be trivial. – Ronald S. Bultje May 19 '16 at 15:45 (a sketch of the full parser loop follows)
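
Following szatmary's linked answer, here is a sketch of building the AVCC extradata (an AVCDecoderConfigurationRecord) from a single SPS and PPS payload with the start codes already stripped; the helper name is made up, but the byte layout is the standard record:

#include <stdint.h>
#include <string.h>

/* out must hold at least 11 + sps_size + pps_size bytes.
   Returns the number of bytes written; copy the result into
   codecpar->extradata before avformat_write_header(). */
static int build_avcc_extradata(const uint8_t *sps, int sps_size,
                                const uint8_t *pps, int pps_size,
                                uint8_t *out)
{
    uint8_t *p = out;
    *p++ = 1;                  /* configurationVersion */
    *p++ = sps[1];             /* AVCProfileIndication */
    *p++ = sps[2];             /* profile_compatibility */
    *p++ = sps[3];             /* AVCLevelIndication */
    *p++ = 0xff;               /* 6 reserved bits + lengthSizeMinusOne = 3 */
    *p++ = 0xe1;               /* 3 reserved bits + numOfSPS = 1 */
    *p++ = (uint8_t)(sps_size >> 8);
    *p++ = (uint8_t)sps_size;
    memcpy(p, sps, sps_size); p += sps_size;
    *p++ = 1;                  /* numOfPPS = 1 */
    *p++ = (uint8_t)(pps_size >> 8);
    *p++ = (uint8_t)pps_size;
    memcpy(p, pps, pps_size); p += pps_size;
    return p - out;
}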
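
Putting the comments together, here is a sketch of the live path: feed the raw Annex B bytes through av_parser_parse2(), convert each assembled frame with the annexb_to_avcc() sketch above, and stamp pts/dts from the known framerate (assuming no B-frames). oc and st are the output context and stream from the setup sketch; read_from_device() is a hypothetical stand-in for the device input:

#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* hypothetical: reads raw Annex B bytes from the Android device */
extern int read_from_device(uint8_t *buf, size_t size);

static void pump_frames(AVFormatContext *oc, AVStream *st, AVRational fps)
{
    AVCodecParserContext *parser = av_parser_init(AV_CODEC_ID_H264);
    AVCodecContext *avctx =
        avcodec_alloc_context3(avcodec_find_decoder(AV_CODEC_ID_H264));
    static uint8_t avcc[512 * 1024];   /* assumed big enough per frame */
    int64_t frame_index = 0;
    uint8_t buf[4096];
    int n;

    while ((n = read_from_device(buf, sizeof(buf))) > 0) {
        uint8_t *data = buf;
        while (n > 0) {
            uint8_t *out = NULL;
            int out_size = 0;
            int used = av_parser_parse2(parser, avctx, &out, &out_size,
                                        data, n, AV_NOPTS_VALUE,
                                        AV_NOPTS_VALUE, 0);
            data += used;
            n    -= used;
            if (out_size > 0) {            /* one complete frame assembled */
                AVPacket pkt;
                av_init_packet(&pkt);
                /* parser output is still Annex B */
                pkt.size = annexb_to_avcc(out, out_size, avcc);
                pkt.data = avcc;
                pkt.stream_index = st->index;
                /* frame counter in a 1/fps time base -> stream time base */
                pkt.pts = pkt.dts = av_rescale_q(frame_index++,
                                                 av_inv_q(fps),
                                                 st->time_base);
                if (parser->key_frame == 1)
                    pkt.flags |= AV_PKT_FLAG_KEY;
                av_interleaved_write_frame(oc, &pkt);
            }
        }
    }
    av_parser_close(parser);
    avcodec_free_context(&avctx);
}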
