
Most WebSocket examples I have seen use either MP4 or WebM container data. Here is some sample JavaScript client code:

var ms = new MediaSource();
...
var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');

In my case, my server sends raw H.264 data (video only, no audio). As there is no MP4/AVC container for my data, I am wondering what the proper way is to define the parameter for addSourceBuffer(). Do I simply omit the video/mp4 tag, as follows? Regards.

var buf = ms.addSourceBuffer('codecs="avc1.64001E"');
Peter

1 Answer


I worked on an H.264 player based on MediaSource several months ago. I didn't expect to keep getting upvotes so long after posting the original answer, so I think I should edit this post to be more helpful. BTW, I'm not a pro; this post is just based on my experience of using the MediaSource API. Comments are welcome to correct me. Thanks!

var buf = ms.addSourceBuffer('video/mp4; codecs="avc1.64001E"');

After buf is created, I think it expects a fragmented MP4 (fMP4) data chunk each time SourceBuffer.appendBuffer() is called.

However, you passed raw H.264 data to it, in which case I think the browser should throw an exception.

In my case, I used ffmpeg to read from an RTSP stream, convert the data to fMP4 format (without re-encoding), and write the output to stdout, letting another application send the data to the browser. (I used a WebSocket, in fact.)

Here are the parameters:

ffmpeg -i rtsp://example.com/ -an -c:v copy -f mp4 \
       -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1
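The piping side of this can be wired up from Java roughly as follows. This is only a sketch under my own assumptions: the class and method names are mine, the 5 MB read cap comes from the assumption mentioned in the comments below, and handing the chunk to a WebSocket is left as a placeholder.

```java
import java.io.IOException;
import java.io.InputStream;

// Launch the ffmpeg command above and read its stdout in variable-size chunks.
public class FfmpegReader {
    // Read up to maxLen bytes from the stream; returns null at EOF.
    static byte[] readChunk(InputStream in, int maxLen) throws IOException {
        byte[] buf = new byte[maxLen];
        int n = in.read(buf);
        if (n < 0) return null;
        byte[] out = new byte[n];
        System.arraycopy(buf, 0, out, 0, n);
        return out;
    }

    public static void main(String[] args) throws IOException {
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg", "-i", "rtsp://example.com/", "-an", "-c:v", "copy",
                "-f", "mp4",
                "-movflags", "+frag_keyframe+empty_moov+default_base_moof",
                "pipe:1")
            .start();
        InputStream stdout = ffmpeg.getInputStream();
        byte[] chunk;
        // assumption: each read returns less than 5 MB
        while ((chunk = readChunk(stdout, 5 * 1024 * 1024)) != null) {
            // hand the chunk to the fragment-caching logic / WebSocket here
        }
    }
}
```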

There's one more thing I want to share. I'm not sure how ffmpeg buffers its output, but it doesn't write a complete fragment each time I read from stdout. So in my backend program, I cached the data first. Here's pseudocode in Java:

byte[] oldbuf = new byte[0];
while (true) {
    byte[] buffer = readDataFromFfmpegStdout();
    // bytes 4-7 of an MP4 box hold its type; a new fragment starts with "moof"
    if (buffer[4] == 'm' && buffer[5] == 'o' && buffer[6] == 'o' && buffer[7] == 'f') {
        send(oldbuf);                     // the old buffer is a complete fragment now
        oldbuf = buffer;
    } else {
        oldbuf = append(oldbuf, buffer);  // keep accumulating until the next moof
    }
}
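If you want something more robust than scanning bytes 4-7 of each read, you can walk the top-level MP4 boxes directly: each box starts with a 4-byte big-endian size followed by a 4-byte ASCII type, so a buffered stream can be cut cleanly before every `moof`. A sketch (the `FragmentSplitter` class and its method names are mine, not from the answer above):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Split a concatenated fMP4 byte buffer into fragments, where each fragment
// starts at a `moof` box and runs until the next `moof` (or end of buffer).
public class FragmentSplitter {
    // 4-byte ASCII type sits right after the 4-byte size field.
    static String boxType(byte[] data, int offset) {
        return new String(data, offset + 4, 4, StandardCharsets.US_ASCII);
    }

    // Box size is a 4-byte big-endian integer at the start of the box.
    static int boxSize(byte[] data, int offset) {
        return ((data[offset] & 0xFF) << 24) | ((data[offset + 1] & 0xFF) << 16)
             | ((data[offset + 2] & 0xFF) << 8) | (data[offset + 3] & 0xFF);
    }

    // Walk the top-level boxes and cut before every `moof`.
    static List<byte[]> split(byte[] data) {
        List<byte[]> fragments = new ArrayList<>();
        int start = 0, pos = 0;
        while (pos + 8 <= data.length) {
            if (boxType(data, pos).equals("moof") && pos > start) {
                fragments.add(Arrays.copyOfRange(data, start, pos));
                start = pos;
            }
            int size = boxSize(data, pos);
            if (size < 8) break;  // malformed box; stop rather than loop forever
            pos += size;
        }
        fragments.add(Arrays.copyOfRange(data, start, data.length));
        return fragments;
    }
}
```

This assumes the buffer begins on a box boundary (which holds if you start caching from the beginning of ffmpeg's output).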

[ORIGINAL ANSWER]

You may check out the project 131/h264-live-player on GitHub, which is based on mbebenita/Broadway, a JavaScript H.264 decoder.

The example server-static.js streams a raw H.264 video over WebSocket, and the client code renders it in a canvas. Git clone that repo, follow the installation instructions, put your h264 file in the samples folder, point video_path to your video file in server-static.js#L28, run node server-static.js, and you will see the video played in your browser.

Please be aware that Broadway only works with the H.264 baseline profile.

IronBlood
    "Broadway can only work with baseline profile." - which makes that lib pretty useless – Michael IV Sep 30 '17 at 10:44
  • I'm trying to do the same thing as the OP, and I'm a bit lost with your pseudocode. Does `ReadDataFromFfmpegStdout()` read a fixed number of bytes? How is this size determined? – Guru Prasad Aug 26 '19 at 12:51
  • @GuruPrasad no, the number of bytes read from stdout of `ffmpeg` is not fixed. But it can tell you whether it's a new fragment by looking at the start; that's why I check whether `moof` is found. (Oh, I made a mistake in the `if` block where I wrote `moov`; check out this explanation of fMP4: https://stackoverflow.com/a/35180327/803378). In fact I had no idea how many bytes I should initialize, which is why I didn't write it in my pseudocode. In my project I made the assumption that each read from `ffmpeg` returns less than 5 MB. – IronBlood Aug 27 '19 at 01:20
  • why not just send everything as is and let the mediasource buffer cache it instead of detecting moov manually? – Ivan Kara Feb 23 '22 at 11:44
  • @IvanKara Good question and in fact that was my original design, and it didn't work as I remembered. I didn't dig deep enough how things work, this solution was based on other online discussions and my practice. – IronBlood Feb 24 '22 at 12:11
  • found another implementation of websocket here (https://github.com/kmoskwiak/node-tcp-streaming-server/blob/master/server/app.js#L37). the hack there is to fix when a client connects not from the beginning of the stream. maybe this is your issue too? – Ivan Kara Feb 24 '22 at 14:21
  • @IvanKara That's one issue, so in my solution, each playback request needs an ffmpeg instance. I didn't reach the 1-to-many goal, as this approach was abandoned in the company I worked for, replaced with other hacks. One of the IPC vendors uses a C/S way, the browser send requests via websockets to a local service, then this service get stream data and render the video borderless, as if the video is part of the browser, and the browser don't decode at all, and also it's not a browser plugin. Back to your search result, I guess some common headers are needed for each new clients. – IronBlood Feb 25 '22 at 16:51
  • @IronBlood is there a way to know the codec without using mp4box? – Ivan Kara Jul 08 '22 at 14:17
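On the last comment: the `avc1.PPCCLL` codec string (RFC 6381) is just the three bytes that follow the NAL header of the H.264 SPS, i.e. profile_idc, the constraint-flags byte, and level_idc, printed in hex, so it can be derived without MP4Box. A sketch (the class and method names are mine):

```java
// Derive the "avc1.PPCCLL" codec string from an H.264 SPS NAL unit:
// byte 0 is the NAL header (0x67 for an SPS), bytes 1-3 are profile_idc,
// the constraint flags, and level_idc.
public class CodecString {
    static String fromSps(byte[] sps) {
        return String.format("avc1.%02X%02X%02X",
                sps[1] & 0xFF, sps[2] & 0xFF, sps[3] & 0xFF);
    }
}
```

For example, an SPS beginning 0x67 0x64 0x00 0x1E yields "avc1.64001E", matching the codec string used in the question.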