
I have an H.264-encoded video stream on my server (Node.js) and I want to use ffmpeg to convert it to an MP4 stream. Then I want to pipe that MP4 stream from the child process to the client using the response of an HTTP server that I have set up. I am very confused by all the options ffmpeg has and not sure how to pipe the output of the child process to the HTTP response.

I have tried several combinations of ffmpeg options, but the video does not play in the browser (or show any sign of receiving data), and I don't know if I am converting it to MP4 correctly. I am also not sure I am piping it correctly. I am not getting any errors on either the server or the client side.

HTML:

<video id="videoPlayer" poster="assets/logo_25.png" autoplay muted controls>
    <source src="http://localhost:3001" type="video/mp4">
</video>

Node Server:

const videoPORT = 3001;
var videoServer = http.createServer(function (req, res) {
    var videoIncommingPORT = 11111;
    var videoSocket = dgram.createSocket('udp4');

    videoSocket.on('listening', function () {
        var address = videoSocket.address();
    });

    videoSocket.on('message', function (message, remote) {
        var child_converter = spawn('ffmpeg', ['-fflags', '+genpts', '-r', '25', '-i', `${message}`, '-vcodec', 'libx264', '-f', 'mp4', 'copy', '-']);//convert h.264 to MP4 container

        child_converter.stdout.pipe(res);//pipe to response

        child_converter.stdout.on('data', (data) => {
            console.log(data);
        });
        child_converter.stderr.on('data', (data) => { 
            console.log(data); 
        });

    });
    videoSocket.bind(videoIncommingPORT);
});
videoServer.listen(videoPORT);
  • `but the video does not play in the browser` - if the browser is even [remotely recent](https://caniuse.com/#search=h264), you don't need to re-encode h264. Just serve the file [as mp4](https://stackoverflow.com/questions/10477430/what-is-the-difference-between-h-264-video-and-mpeg-4-video#comment13538206_10477523). – GSerg Jun 01 '19 at 13:52
  • MP4 can't really be piped. It's not playable until the moov is written at the end of the file. So you have to wait until the ffmpeg process is done anyway. It's better to save it, then serve it. – szatmary Jun 01 '19 at 16:26
  • @GSerg I am trying to take the raw h.264 encoded video I have on the back end and convert it to MP4 or anything that the client can read in the browser. The browser won't read raw h.264, I believe. – Ricky Jun 03 '19 at 13:49
  • @szatmary I need to live stream the video in real time. The video feed is from a drone and the client needs to see the video in real time. It is a stream; I cannot save it as a file and then serve it up later. – Ricky Jun 03 '19 at 13:50
  • Then you can’t use mp4. – szatmary Jun 03 '19 at 13:52
  • Does anyone have any suggestions about how to approach this problem? I could try .ogg or .webm streams as well, since they are supported by HTML5. – Ricky Jun 03 '19 at 14:00
  • HLS or DASH using fragmented mp4. You will need a player too. – szatmary Jun 03 '19 at 14:48
  • WebM is totally streamable this way, but you can't use H.264 with it. – Brad Jun 03 '19 at 14:51
  • @szatmary what do you mean by player? Something like jPlayer? http://jplayer.org/ – Ricky Jun 03 '19 at 15:21
  • @Ricky if you generate a fragmented stream then you should be able to feed it directly to a video tag; I've developed a similar solution previously. I did the fragmenting myself in C++, so I'm not sure what ffmpeg flags you need, but that should be doable. A fragmented file will give you the moov at the start and then fMP4 chunks that correspond to the actual video data. If you are interested, I can give a link to the solution I developed so you can inspect how the stream looks. – Rudolfs Bundulis Jun 04 '19 at 13:59

0 Answers