
I'm currently working on the following:

On one computer, I have a browser with a white canvas that you can draw on. On many other computers, that canvas should be received as a video stream. The plan is to somehow convert the canvas surface to a video stream and send it via UDP to the other computers.

What I have achieved so far: the canvas is redrawn on the other computers using Node.js and socket.io (I basically just send the drawing information, like the coordinates). I also use the canvas's captureStream() method to feed the canvas surface into a video tag. So visually it's working: I draw on one computer, and on the other computers I can set the video to fullscreen and it appears to work.

But that's not yet what I want and need. I need it as a real video stream, so that I can receive it with MPV, for example. So the question is: how can I send the canvas surface as a UDP live video stream? Probably I would also need to pipe it through FFmpeg or something to transcode it.

I've read a lot so far, but haven't completely figured out what to do...

I had a look at the MediaStream you get back from captureStream(), but that doesn't seem to help a lot, as getTracks() isn't working when capturing from a canvas.

Also, regarding WebRTC, I'm not sure it would work; isn't it P2P? Or can I somehow broadcast it and send packets to a UDP address? What I read here is that this is not directly possible. But even if it were, what should I send then? So how can I send the canvas surface as a video?

So there are basically two questions: 1. What would I have to send, i.e. how can I get the canvas into a video stream? 2. How can I send it as a stream to other clients?

Any approaches or tips are welcome.

nameless
  • You said that `getTracks()` wasn't working, but why do you need a `track` specifically? You can send the entire stream and access the [`canvas`](https://developer.mozilla.org/en-US/docs/Web/API/CanvasCaptureMediaStream) property on the other end, correct? Assuming it's supported, of course. – Patrick Roberts Jul 10 '17 at 07:26
  • @PatrickRoberts It's correct that I can access the canvas property on the other side, yes. So on the other side I have the `CanvasCaptureMediaStream` with its canvas, id, active, onaddTrack and currentTime, as well as the `stream` variable, which puts the content into the video tag. But for me it's not a real "stream", or rather, the question is: how do I get it into my backend? How can I really stream it to a UDP address, for example? – nameless Jul 10 '17 at 07:37
  • WebRTC is P2P because it doesn't have a "backend" aside from the initial signaling process. Since JavaScript has no support for UDP other than via WebRTC, whose API is very tightly controlled to prevent arbitrary UDP access, this is not possible. See [this answer](https://stackoverflow.com/a/13478490/1541563) for a better explanation. – Patrick Roberts Jul 10 '17 at 07:39
  • @PatrickRoberts So you think this won't work at all? I'm quite sure that somehow it has to be possible; every problem has a solution, the question is only which one... – nameless Jul 10 '17 at 07:42
  • Possible duplicate of [How to send a UDP Packet with Web RTC - Javascript?](https://stackoverflow.com/questions/13216785/how-to-send-a-udp-packet-with-web-rtc-javascript) – Patrick Roberts Jul 10 '17 at 07:42
  • @PatrickRoberts Yes, that's the question I already pointed at in my thread as well. But even if I were able to send "UDP packets" or manage something similar, I'm still not sure what I should send, i.e. how I can get the canvas as a real video stream. – nameless Jul 10 '17 at 07:44
  • The `CanvasCaptureMediaStream` _is_ a real video stream, I don't understand your question. – Patrick Roberts Jul 10 '17 at 07:45
  • @PatrickRoberts Is it? If yes, then do I see it right that if I managed to send this "variable" to a UDP address, I could just receive it with MPV, for example? – nameless Jul 10 '17 at 07:46
  • You must implement a signaling server that connects the two endpoints before the UDP streaming can begin; this is how WebRTC works. [This looks promising](https://github.com/andyet/signalmaster), and since you already mentioned getting a socket.io server working, this should be fairly easy to implement in your project. – Patrick Roberts Jul 10 '17 at 07:49
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/148756/discussion-between-nameless-and-patrick-roberts). – nameless Jul 10 '17 at 07:52
  • @PatrickRoberts Just to help me understand it right: the `CanvasCaptureMediaStream` is a "real" video stream? To send it to another client, I need a signaling server for connecting the endpoints, and then I can send the MediaStream via UDP? – nameless Jul 10 '17 at 07:54

1 Answer


The timetocode.org site is an example of streaming from an HTML5 canvas (on the host computer) to a video element (on a client computer).

There's help under the "More on the demos" link on the main page; read the topic on the multiplayer features there. But basically you just check the "Multiplayer" option, name a "room", connect to that room (that makes you the host of that room), follow one of the links to the client page, then connect the client to the room you set up. You should shortly see the canvas video streaming out to the client.

It uses socket.io for signaling in establishing WebRTC (P2P) connections. Note that the client side sends mouse and keyboard data back to the host via a WebRTC datachannel.
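The signaling step can be reduced to a small relay: peers in the same room exchange SDP offers/answers and ICE candidates through the server, which never touches the media itself. A transport-agnostic sketch of that relay logic (the function and field names here are my own, not from the timetocode source):

```javascript
// Minimal signaling relay sketch: peers register a delivery callback for a
// room; any signal from one peer is forwarded to all other peers in it.
// In practice the callbacks would be socket.io emits.
function createSignalRouter() {
  const rooms = new Map(); // room name -> Map(peerId -> deliver callback)
  return {
    join(room, peerId, deliver) {
      if (!rooms.has(room)) rooms.set(room, new Map());
      rooms.get(room).set(peerId, deliver);
    },
    signal(room, fromId, message) {
      const peers = rooms.get(room);
      if (!peers) return 0;
      let delivered = 0;
      for (const [id, deliver] of peers) {
        if (id !== fromId) {
          deliver(fromId, message); // forward offer/answer/ICE candidate
          delivered++;
        }
      }
      return delivered;
    },
  };
}
```

Once the offer/answer/ICE messages have been relayed this way, the browsers open the WebRTC connection directly between themselves and the video (and datachannel) traffic bypasses the server.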

Key parts of the host-side code for the video stream are the captureStream method of the canvas element,

var hostCanvas = document.getElementById('hostCanvas');
var videoStream = hostCanvas.captureStream(); // optionally pass a frame rate, e.g. captureStream(60)

and the addTrack method of the WebRTC peer connection object,

pc.addTrack(videoStream.getVideoTracks()[0], videoStream);

and on the client-side code, the ontrack handler that directs the stream to the srcObject of the video element:

pc.ontrack = function (evt) {
   videoMirror.srcObject = evt.streams[0];
};
J Miller