I'm currently working on the following:
On one computer, I have a browser with a white canvas that you can draw on. On many other computers, you should be able to receive that canvas as a video stream. The plan would be to somehow convert the canvas surface to a video stream and send it via UDP to the other computers.
What I have achieved so far: the canvas is redrawn on the other computers with Node.js and socket.io (so I basically just send the drawing information, like the coordinates). I also use WebRTC's captureStream() method to pipe the canvas surface into a video tag. So visually it's working: I draw on one computer, and on the other computers I can just set the video to fullscreen and it appears to work.
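For reference, the captureStream() part looks roughly like this (simplified; the element IDs are placeholders, and this runs in the browser):

```javascript
// Turn the canvas into a MediaStream and preview it in a <video> tag
const canvas = document.getElementById('drawCanvas'); // placeholder id
const stream = canvas.captureStream(30);              // capture at ~30 fps

const video = document.getElementById('preview');     // placeholder id
video.srcObject = stream;
video.play();
```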
But that's not yet what I want and need. I need it as a real video stream, so that I could, for example, receive it with MPV. So the question is: how can I send the canvas surface as a live UDP video stream? Probably I would also need to pipe it through FFmpeg or something to transcode it.
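To illustrate what I imagine the transcoding step to look like (an untested sketch; the input format and the multicast address are assumptions on my part): if I could get encoded WebM chunks out of the browser and onto a server process's stdin, an ffmpeg invocation along these lines might restream them as MPEG-TS over UDP, which MPV can open:

```
# Read a WebM stream from stdin, transcode to H.264 in an MPEG-TS
# container, and push it out over UDP (the address is a placeholder)
ffmpeg -i pipe:0 \
  -codec:v libx264 -preset ultrafast -tune zerolatency \
  -f mpegts udp://239.0.0.1:1234
```

A client could then, in theory, play it with `mpv udp://239.0.0.1:1234`. But I haven't verified this pipeline, and the part before it (getting the chunks out of the browser) is exactly what I'm missing.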
I have read a lot so far, but basically haven't figured out what to do.
I had a look at the MediaStream you get back from captureStream(), but that doesn't seem to help much, as getTracks() doesn't work for me when capturing from a canvas.
Also, when talking about WebRTC, I'm not sure whether it would even work here: isn't it P2P? Or can I somehow broadcast it and send packets to a UDP address? What I read here is that it is not directly possible. But even if it were, what would I send then? So how can I send the canvas surface as a video?
So there are basically two questions: 1. What would I have to send, i.e. how can I turn the canvas into a video stream? 2. How can I send that stream to the other clients?
Any approaches or tips are welcome.