
I'm serving this fragmented mp4 file which was created by ffmpeg.

Here is the client side code:

<!DOCTYPE html>
<html>
    <head>
        <title>WebSocket and MSE demo</title>
    </head>

    <body>
        <video id="livestream" width="640" height="480" autoplay></video>
    </body>

    <script>
            //var verbose = false;
            var verbose = true;

            // set mimetype and codec
            var mimeType = "video/mp4";
            var codecPars = mimeType;

            var streamingStarted = false; // is the sourceBuffer updateend callback active or not

            // create media source instance
            var ms = new MediaSource();

            // queue for incoming media packets
            var queue = [];

            var ws; // websocket
            var sourceBuffer; // created in opened() once the MediaSource is ready

            // ** two callbacks ** 
            // - pushPacket : called when websocket receives data
            // - loadPacket : called when sourceBuffer is ready for more data
            // Both operate on a common fifo

            function pushPacket(arr) { // receives ArrayBuffer.  Called when websocket gets more data
                // first packet ever to arrive: write directly to sourceBuffer
                // sourceBuffer ready to accept: write directly to SourceBuffer
                // otherwise insert to fifo

                var view   = new Uint8Array(arr);
                if (verbose) { console.log("got", arr.byteLength, "bytes.  Values=", view[0], view[1], view[2], view[3], view[4]); }

                var data = arr;

                if (!streamingStarted && !sourceBuffer.updating) {
                    if (verbose) console.log("Streaming started with", view[0], view[1], view[2], view[3], view[4]);
                    sourceBuffer.appendBuffer(data);
                    streamingStarted=true;
                    return;
                }

                queue.push(data); // add to the end
                if (verbose) { console.log("queue push:", queue.length); }               
            }

            function loadPacket() { // called when sourceBuffer is ready for more
                if (!sourceBuffer.updating) { // really, really ready
                    if (queue.length > 0) {
                        var inp = queue.shift(); // pop from the beginning
                        if (verbose) { console.log("queue PULL:", queue.length); }

                        var view = new Uint8Array(inp);
                        if (verbose) { console.log("                        writing buffer with", view[0], view[1], view[2], view[3], view[4]); }

                        sourceBuffer.appendBuffer(inp);
                    }
                    else { // the queue ran empty, so the next packet must be fed directly
                        streamingStarted = false;
                    }
                }
            }



            function opened() { // now the MediaSource object is ready to go
                sourceBuffer = ms.addSourceBuffer(codecPars);
                sourceBuffer.addEventListener("updateend",loadPacket);

                // set up the websocket
                ws = new WebSocket("ws://videoserver");
                ws.binaryType = "arraybuffer";
                ws.onmessage = function (event) {
                    pushPacket(event.data);
                };
            }


            function startup() {
                // add event listeners
                ms.addEventListener('sourceopen',opened,false);             

                // get reference to video
                var livestream = document.getElementById('livestream');

                // set mediasource as source of video
                livestream.src = window.URL.createObjectURL(ms);
            }


            window.onload = function() {
                startup();
            }
        </script>   
</html>

The video file is served over a websocket.

The solution works (under Windows 10 at least) in:

  • Chrome browser Version 80.0.3987.116 (Official Build) (64-bit)
  • Microsoft Edge 44.18362.449.0
  • Internet Explorer 11.657.18362.0
  • Opera 66.0.3515.103

However, it does not work in:

  • Firefox 73.0.1 (64-bit) (under Windows 10)
  • Firefox 73.0 (64-bit) under Ubuntu 18.04

The video does not play in Firefox; the browser reports: Media resource blob:http://abcdef could not be decoded.

How can I debug Firefox to see what the problem is?
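One quick check worth doing first: Firefox is stricter than Chrome about the MIME string handed to `addSourceBuffer`, and the code above passes a bare `video/mp4` with no `codecs` parameter. A sketch for probing what the browser's MSE implementation accepts (the `avc1`/`mp4a` codec strings below are examples, not necessarily what ffmpeg produced for this file — check the real codec IDs with `ffprobe`):

```javascript
// Probe which MIME/codec strings this browser's MSE accepts.
// NOTE: the codec parameters are example values (H.264 Baseline + AAC-LC);
// substitute the actual codec IDs of the served file.
var candidates = [
    'video/mp4',                                  // what the demo passes
    'video/mp4; codecs="avc1.42E01E"',            // H.264 only (example)
    'video/mp4; codecs="avc1.42E01E, mp4a.40.2"', // H.264 + AAC (example)
];

var hasMSE = (typeof MediaSource !== 'undefined');
candidates.forEach(function (c) {
    var ok = hasMSE ? MediaSource.isTypeSupported(c) : 'no MSE in this runtime';
    console.log(c, '->', ok);
});
```

If the bare string probes differently from the parameterized ones, that narrows the problem down before digging into the mp4 itself.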

Daniel
  • Have you been able to solve this? Having similar issues: https://stackoverflow.com/questions/64433422/how-to-append-fmp4-chunks-to-sourcebuffer – Stefan Falk Oct 19 '20 at 18:53
  • Yes, the mp4 was fragmented, but some IDs in the fragments were incorrect. Honestly, I don't remember the specifics, but it was some strict firefox rule which the file violated. – Daniel Oct 20 '20 at 14:15
  • How are you chunking the data on your server? You append `data` with `sourceBuffer.appendBuffer(data)` but is it necessary to chunk fMP4 at specific offsets or something? – Stefan Falk Oct 20 '20 at 16:23
  • when ffmpeg or libavformat creates fmp4, it's already chunked (fragmented). But this topic is very complex, it depends on a lot of factors. Where does your server get the source from? Livestream? File? What software do you use on your server? – Daniel Oct 21 '20 at 12:40
  • The file is uploaded to a cloud storage. With MP3 I did the following: The client seeks to e.g. 1 Minute, I compute an offset in the file and start reading the bytes of the file from there and send a chunk to the client e.g. 10 seconds of that MP3. Before the client comes to the end, it just requests the next chunk and so on. But apparently this approach does not work with an fMP4. The reason why I do all this: The track is supposed to start playing quickly and the user should not have to wait until he downloaded a 100MB+ file. – Stefan Falk Oct 21 '20 at 15:03
  • With fMP4 this approach can work, but you'll have to send some initialization `box`-es, like `ftyp` and `moov`, and then you can continue with the fragments in the middle. But you might also need to set the timestamps and IDs in the fragments, as not all browsers like fragments with "other-than-0" start IDs. So to conclude, you can't do this without some mp4 stream software. Hard way: you can start sending the file from an offset, but then you'll have to deal with the ID mods on the client side, and it's a real pain. Much easier to use a server-side "server", like `ffmpeg`. – Daniel Oct 21 '20 at 15:29
  • I kind of feel this entire "play audio on the web"-thing is a big joke. The further I go the more insane it gets. Wtf is going on here, all I want is play a stupid audio track.. All this crap just because Firefox can't just play MP3 like Chrome? I'm sorry that I'm losing it right now but I'm so tired of this. It's a never-ending story and everything worked fine until I realized that Fireox just does not play a simple MP3 file .. – Stefan Falk Oct 24 '20 at 08:54
  • I don't know if I want another complication. I have no idea how to use `ffmpeg` as a server on my backend and I am very certain that there will be a million issues ahead. There just has to be a simpler way than doing all this stuff just .. just because we can't simply play MP3 +facepalm+ – Stefan Falk Oct 24 '20 at 08:56
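To see whether the stream actually starts with the initialization boxes mentioned above (`ftyp`, `moov`) before the `moof`/`mdat` fragments, the incoming `ArrayBuffer`s can be inspected directly in `pushPacket`. A minimal sketch of a top-level box lister (assumes 32-bit box sizes; 64-bit `largesize` boxes are not handled):

```javascript
// List the top-level MP4 boxes (type, size, offset) found in an ArrayBuffer.
// Each box starts with a 4-byte big-endian size and a 4-character type code.
function listBoxes(arrayBuffer) {
    var view = new DataView(arrayBuffer);
    var boxes = [];
    var offset = 0;
    while (offset + 8 <= view.byteLength) {
        var size = view.getUint32(offset); // big-endian box size
        var type = String.fromCharCode(
            view.getUint8(offset + 4), view.getUint8(offset + 5),
            view.getUint8(offset + 6), view.getUint8(offset + 7));
        boxes.push({ type: type, size: size, offset: offset });
        if (size < 8) break; // malformed or largesize box; stop here
        offset += size;
    }
    return boxes;
}
```

Logging `listBoxes(arr).map(function (b) { return b.type; })` for the first few websocket messages shows immediately whether Firefox is being fed fragments without an initialization segment.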

0 Answers