
There are many examples online that use a GStreamer pipeline with "tcpclientsink" or "udpsink" together with NodeJS to get the GStreamer pipeline output into a web browser.

But I could not find any example or documentation that clearly explains how to use the webrtcbin element with a NodeJS server to send the stream to a web browser. (An alternative to webrtcbin would be fine, too.)

I have the following GStreamer pipeline:

gst-launch-1.0 videotestsrc  \
! queue ! vp8enc ! rtpvp8pay \
! application/x-rtp,media=video,encoding-name=VP8,payload=96 \
! webrtcbin name=sendrecv

Can someone help with consuming this pipeline from a NodeJS-based server to display the stream in a web browser?

Here is a similar example, but it uses tcpclientsink: https://tewarid.github.io/2011/04/26/stream-live-webm-video-to-browser-using-node.js-and-gstreamer.html


3 Answers


Update: In the end, I was able to get the GStreamer output to the browser using the NodeJS tutorial mentioned in the question. Here is proof-of-concept code that someone can use if needed (or in case the tutorial link disappears from the internet):

var express = require('express')
var http = require('http')
var net = require('net');
var child = require('child_process');
require('log-timestamp');   //adds timestamp in console.log()

var app = express();
app.use(express.static(__dirname + '/'));

var httpServer = http.createServer(app);
const port = 9001;  //change port number if required

var gstMuxer;       //handle to the spawned GStreamer child process

//send the html page which holds the video tag
app.get('/', function (req, res) {
    res.sendFile(__dirname + '/index.html');
});

//stop the connection
app.post('/stop', function (req, res) {
    console.log('Connection closed using /stop endpoint.');

    if (gstMuxer != undefined) {
        gstMuxer.kill();    //killing GStreamer Pipeline
        console.log(`After gstkill in connection`);
    }
    gstMuxer = undefined;
    res.end();
});

//send the video stream
app.get('/stream', function (req, res) {

    res.writeHead(200, {
        'Content-Type': 'video/mp4',    //the pipeline below muxes fragmented MP4
    });

    var tcpServer = net.createServer(function (socket) {
        socket.on('data', function (data) {
            res.write(data);
        });
        socket.on('close', function (had_error) {
            console.log('Socket closed.');
            res.end();
        });
    });

    tcpServer.maxConnections = 1;

    tcpServer.listen(function () {
        console.log("Connection started.");
        if (gstMuxer == undefined) {
            console.log("inside gstMuxer == undefined");
            var cmd = 'gst-launch-1.0';
            var args = getGstPipelineArguments(this);
            gstMuxer = child.spawn(cmd, args);  //assign to the module-level gstMuxer

            gstMuxer.stderr.on('data', onSpawnError);
            gstMuxer.on('exit', onSpawnExit);

        }
        else {
            console.log("New GST pipeline rejected because gstMuxer != undefined.");
        }
    });
});

httpServer.listen(port);
console.log(`Camera Stream App listening at http://localhost:${port}`)

process.on('uncaughtException', function (err) {
    console.log(err);
});

//functions
function onSpawnError(data) {
    console.log(data.toString());
}

function onSpawnExit(code) {
    if (code != null) {
        console.log('GStreamer error, exit code ' + code);
    }
}

function getGstPipelineArguments(tcpServer) {
    //Replace 'videotestsrc', 'pattern=ball' with your camera source in the GStreamer pipeline arguments below.
    //Note: every element and property of the pipeline must be passed as a separate quoted string, as done below.
    var args =
        ['videotestsrc', 'pattern=ball',
            '!', 'video/x-raw,width=320,height=240,framerate=100/1',
            '!', 'vpuenc_h264', 'bitrate=2000',
            '!', 'mp4mux', 'fragment-duration=10',
            '!', 'tcpclientsink', 'host=localhost',
            'port=' + tcpServer.address().port];
    return args;
}
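
Note that 'vpuenc_h264' is a platform-specific hardware encoder (it comes from the i.MX GStreamer plugins), and, as the comments below point out, it may not exist on your machine. If it is not available, a software pipeline is one possible alternative. The sketch below (the helper name and the bitrate/framerate values are only illustrative) uses the standard vp8enc and webmmux elements; if you use it, serve the stream with Content-Type 'video/webm' instead of 'video/mp4':

//Alternative pipeline arguments for platforms without 'vpuenc_h264' (sketch).
//Software VP8 encoding + WebM muxing; serve the result as 'video/webm'.
function getSoftwareGstPipelineArguments(tcpServer) {
    var args =
        ['videotestsrc', 'pattern=ball',
            '!', 'video/x-raw,width=320,height=240,framerate=30/1',
            '!', 'videoconvert',
            '!', 'vp8enc', 'deadline=1', 'target-bitrate=2000000',
            '!', 'webmmux', 'streamable=true',
            '!', 'tcpclientsink', 'host=localhost',
            'port=' + tcpServer.address().port];
    return args;
}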

And here is the HTML code:

<!DOCTYPE html>
<html>

<head>
    <title>GStreamer with NodeJS Demo</title>
    <meta name="viewport" content="width=device-width, initial-scale=0.9">

    <style>
        html,
        body {
            overflow: hidden;
        }
    </style>
    
    <script>
        function buffer() {
            //Start playback as soon as possible to minimize latency at startup 
            var dStream = document.getElementById('vidStream');

            try {
                dStream.play();
            } catch (error) {
                console.log("Error in buffer() method.");
                console.log(error);
            }

        }
    </script>
</head>

<body onload="buffer();">
    <video id="vidStream" width="640" height="480" muted>
        <source src="/stream" type="video/mp4" />
        <source src="/stream" type="video/webm" />
        <source src="/stream" type="video/ogg" />
        <!-- fallback -->
        Your browser does not support the <code>video</code> element.
    </video>
</body>

</html>
Pawan Pillai
  • Doing video over TCP does have some downsides, but glad it worked for you! WebRTC has congestion control + adaptive bitrate, so it will make sure you don't oversend video if you don't have enough bitrate available. Before GStreamer had webrtcbin I wrote https://github.com/pion/example-webrtc-applications/tree/master/gstreamer-send which could be helpful. – Sean DuBois Jun 09 '21 at 20:26
  • Hi @SeanDuBois, yes we faced some issues with the bitrate initially but we fine-tuned it. And due to varying networkState and readyState, GStreamer does crash sometimes. But we are able to achieve a balance and reload the stream if this happens. On a new browser window (like a new window popup), we have found the stream very stable. It crashes once after 60+ mins of streaming and then we automatically reload the page. So far, it's running in an acceptable state. I will take a look at your implementation also. Thanks. – Pawan Pillai Jun 11 '21 at 13:44
  • what's the html page that the code refers to? When I get this running I just get a blank page with index.js when I load the site (sorry if this is obvious - I'm a noob with browser stuff). - nvm, it's at http://localhost:9001/stream – Edward Sep 07 '21 at 18:17
  • I've tried a bunch of sample html pages with video tags but none play the stream correctly, do you have an example? – Edward Sep 07 '21 at 19:02
  • @PawanPillai Where do you copy/save the html code? or what do you modify so node.js script can use that html. – jpvans Oct 12 '21 at 13:12
  • @jpvans the HTML file needs to be in the same folder as the javascript file above. If your Node JS is working properly and this JS code is executed, then the app.get("/"...) line in javascript will load the index.html when you start the web page. Please check basic nodejs tutorials to get started and it should make sense. – Pawan Pillai Oct 20 '21 at 18:31
  • I don't think this should be the accepted answer. The question specifically asked to use *webrtcbin* with a web browser, and this answer uses an alternative to webrtcbin. – Multisync Mar 10 '23 at 13:47
  • @Multisync I know the answer does not point towards webrtcbin, but it solves the main requirement of running a GStreamer stream in a browser. I have used this method to run a realtime GStreamer feed from a robot onto a browser with less than 200ms latency. My main purpose in providing this solution was to help someone else who is looking for a similar solution. Hope this helps. – Pawan Pillai Mar 10 '23 at 18:36
  • @PawanPillai, I edited the question to reflect that. – Multisync Mar 15 '23 at 13:14
  • This answer sends http content type `video/webm` for mp4. – Multisync Mar 15 '23 at 13:28
  • @PawanPillai does this technique still work? Do you stream to chrome? I'm unable to run the pipeline as it says `no element "vpuenc_h264"` Also h264 is not supported on chrome anymore, do you have something newer for it? – Márius Rak Jul 28 '23 at 23:01
  • @MáriusRak I am not actively maintaining that project now, so do not know the latest changes in Chrome. But till I was maintaining it, I was getting very high performance stream with under 200ms delay. – Pawan Pillai Aug 04 '23 at 15:09

Unfortunately, it's not that simple. You have to have some way to interact with the browser in order to exchange the SDP offer/answer and the ICE candidates.

You can look at an example here.
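
To make the "exchange" part concrete, here is a minimal sketch (not taken from the linked example) of a NodeJS signaling relay that simply forwards SDP and ICE messages between two connected WebSocket clients, e.g. the browser on one side and a webrtcbin-based sender on the other. The 'ws' npm package, the port number, and the JSON message format are assumptions; both peers have to agree on the same format.

//Minimal signaling relay sketch (assumes: npm install ws).
//Each message received from one peer is forwarded verbatim to the other peer(s);
//the peers exchange JSON such as {"sdp": ...} and {"ice": ...} through it.
var WebSocket = require('ws');

var wss = new WebSocket.Server({ port: 8443 });
var peers = [];

wss.on('connection', function (ws) {
    peers.push(ws);

    ws.on('message', function (message) {
        //forward SDP offers/answers and ICE candidates to the other peer(s)
        peers.forEach(function (other) {
            if (other !== ws && other.readyState === WebSocket.OPEN) {
                other.send(message.toString());
            }
        });
    });

    ws.on('close', function () {
        peers = peers.filter(function (p) { return p !== ws; });
    });
});

console.log('Signaling relay listening on ws://localhost:8443');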

RSATom
  • Yes, it does not look like an easy setup. I will take a look at the CPP code and let you know if it helps. – Pawan Pillai Sep 18 '20 at 12:58
  • any pointers on how to use that repo to get an MVP of a simple gstreamer pipeline streaming to a webpage? – Edward Sep 07 '21 at 18:47
  • @Edward you can look at the complete app here https://github.com/WebRTSP/ReStreamer (binaries https://snapcraft.io/rtsp-to-webrtsp). Also you can find a minimal example at https://github.com/WebRTSP/Native/tree/master/Apps/BasicServer, but it's a little bit broken right now (I'll fix it soon) – RSATom Sep 08 '21 at 02:22
  • @Edward I've prepared a minimal working example https://github.com/WebRTSP/ClockServer If you have any questions you can create an issue or start a discussion on GitHub – RSATom Sep 08 '21 at 03:51

There is a nice integration test for GStreamer (and other applications such as browsers) available here: https://github.com/sipsorcery/webrtc-echoes/tree/master/gstreamer. It works with minimal quirks (at least in Chrome). It gets its data from this GStreamer pipeline

  pipeline =
     gst_parse_launch ("webrtcbin bundle-policy=max-bundle name=sendonly "
       "videotestsrc is-live=true pattern=ball ! videoconvert ! queue ! vp8enc deadline=1 ! rtpvp8pay ! "
       "queue ! " RTP_CAPS_VP8 " ! sendonly. "
       , &error);

and opens a web server from which the browser can obtain the stream. You have to open the index.html manually.
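
For orientation, the browser side of such a setup usually boils down to a handful of WebRTC API calls: create an RTCPeerConnection, send an SDP offer to the server, apply the returned answer, and attach the incoming track to a video element. Below is a minimal sketch of that flow; the '/offer' endpoint, its JSON payload, and the 'vidStream' element id are assumptions for illustration and are not taken from the linked repository, so check its index.html for the signaling it actually uses.

//Browser-side sketch for receiving video from a webrtcbin sender.
async function startReceiving() {
    const pc = new RTCPeerConnection();

    //receive-only video
    pc.addTransceiver('video', { direction: 'recvonly' });

    //attach the incoming track to a <video id="vidStream"> element
    pc.ontrack = (event) => {
        const video = document.getElementById('vidStream');
        video.srcObject = event.streams[0] || new MediaStream([event.track]);
    };

    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);

    //wait until ICE gathering finishes so the offer contains all candidates
    //(a real application would more likely trickle candidates instead)
    await new Promise((resolve) => {
        if (pc.iceGatheringState === 'complete') return resolve();
        pc.addEventListener('icegatheringstatechange', () => {
            if (pc.iceGatheringState === 'complete') resolve();
        });
    });

    //send the offer to the server and apply the answer it returns
    const response = await fetch('/offer', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(pc.localDescription)
    });
    await pc.setRemoteDescription(await response.json());
}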