
I want to use a canvas element as the MediaStream source for the video part of a WebRTC communication. Any directions would be helpful; I have scoured the net and am not finding many resources discussing this topic.

*Long Background Story*

The problem: I cannot send the video from the camera directly; it is part of the requirements that I process the video (some image-processing stuff, out of scope for this question) before displaying it.

Previously, on the other peer's browser, instead of directly displaying the video with a `<video>` tag, I did some processing on a hidden canvas element and then copied the result to another canvas (I used a `setTimeout` to keep drawing, which gave the illusion of live video).

Now the client wants the processing done before transmission of the video, so I used WebRTC to pass the audio stream directly (previously both audio and video were sent via WebRTC). For the video stream, I had two solutions:

Steps:

  1. Process the video on the local peer and draw it on a hidden canvas. The easy part.

  2. Use a timeout to repeatedly capture the image data and transmit it:
    a) using WebSockets (yes, it goes through the server), which came with horrible lag and an eventual browser crash;
    b) using RTCDataChannel, which had much better performance but at times fails for no reason. I also had several other issues (e.g. the extra bandwidth used, because of sending JPEG instead of WebP).
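A minimal sketch of step 2b, capturing the hidden canvas as a JPEG and pushing it over a data channel. The chunk size, helper names, and end-of-frame marker are assumptions for illustration, not from the original setup; RTCDataChannel messages are commonly kept under ~16 KB for cross-browser interop:

```javascript
// Assumed safe per-message size for cross-browser RTCDataChannel interop.
var CHUNK_SIZE = 16 * 1024;

// Split a JPEG data-URL string into DataChannel-sized chunks.
function splitIntoChunks(dataUrl, size) {
  var chunks = [];
  for (var i = 0; i < dataUrl.length; i += size) {
    chunks.push(dataUrl.slice(i, i + size));
  }
  return chunks;
}

// Hypothetical send step: encode the canvas as JPEG, send the chunks,
// then an end-of-frame marker so the receiver knows when to reassemble.
function sendFrame(canvas, channel) {
  var dataUrl = canvas.toDataURL('image/jpeg', 0.5);
  splitIntoChunks(dataUrl, CHUNK_SIZE).forEach(function (chunk) {
    channel.send(chunk);
  });
  channel.send('EOF');
}
```

The receiver would concatenate chunks until it sees the marker and assign the result to an image's `src`.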

Another major issue is that because I am using a timeout, the frame rate drops on the other side when I switch tabs.

So, is there any way I can use the hidden canvas as a MediaStream source instead of doing it manually?

BananaNeil
mido
  • Could you restrict use to Firefox? You MAY be able to jerry-rig something with [mozCaptureStreamUntilEnded](https://github.com/muaz-khan/WebRTC-Experiment/tree/master/experimental/mozCaptureStreamUntilEnded). Or the only other way I can think of off the top of my head is to relay the media through an MCU of some sort (like the Janus-Gateway, or Erizo from Licode) – Benjamin Trent Jan 08 '15 at 15:12
  • unfortunately, it needs to support both Firefox and Chrome. I thought of the ```mozCaptureStreamUntilEnded``` route, but in Firefox too, it would require pre-recorded media, right? – mido Jan 09 '15 at 12:03
  • There may be something that could be hacked together that would still use timeouts to capture canvases, which jesup mentions in his answer. – Benjamin Trent Jan 09 '15 at 16:02

2 Answers


mozCaptureStreamUntilEnded is going to be the basis for a proposal Martin Thompson is working on for the WG, to connect directly to a MediaStream. A workaround in Firefox per the comments here is mozCaptureStreamUntilEnded from a `<video>` fed from a canvas captured from the MediaStream. An ugly sequence, which is part of why we're going to allow direct output of a `<canvas>` to a MediaStream (and standardize captureStream on `<video>` as well).

Note that feeding mozCaptureStream(UntilEnded) into a PeerConnection was broken for a while (partly since it's non-standard thus far); it's fixed in Firefox 36 (due on the release channel in 6 weeks; going to Beta next week). See Bug 1097224 and Bug 1081409.
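The Firefox-only workaround above could be wrapped in a small helper like this sketch. The function name and the null fallback are assumptions; `mozCaptureStream` is non-standard and Firefox-only, and `addStream` was the PeerConnection API of that era:

```javascript
// Firefox-only sketch: feed a <video> element's playing content into a
// PeerConnection via the non-standard mozCaptureStream API.
// Returns the captured stream, or null if the API is unavailable.
function sendVideoViaMozCapture(videoEl, pc) {
  if (typeof videoEl.mozCaptureStream !== 'function') {
    return null; // not Firefox, or the API is missing
  }
  var stream = videoEl.mozCaptureStream();
  pc.addStream(stream); // addStream was the API at the time (2015)
  return stream;
}
```

The `<video>` element here would itself be fed from the hidden canvas, per the workaround described in the answer.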

An incredibly hacky way on Chrome and Firefox would be to put the video in a window and then screen-capture the window. I don't advise it, since it requires screen-sharing permission, selecting the window, etc.

The only other option for Chrome (or Firefox) is to save frames of video as JPEGs (as you mention) and send over a DataChannel. Effectively Motion-JPEG, but run by JS. Framerate and quality (and delay) will suffer. You may want to use an unreliable channel since on errors you can throw the frame away and just decode the next one (it's MJPEG after all). Also, if delay gets too high, reduce the frame size! You'll want to estimate the end-to-end delay; best way is to feed back the decode time over datachannels to the sender and have it use the reception time of that packet to estimate delay. You care more about changes in delay than absolute values!!
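The unreliable-channel idea in the paragraph above could look like the following sketch. The channel label, the per-frame sequence numbers, and the drop rule are assumptions added for illustration:

```javascript
// Create an unreliable, unordered DataChannel: lost frames are never
// retransmitted, matching the "throw the frame away and decode the
// next one" MJPEG approach described above.
function createFrameChannel(pc) {
  return pc.createDataChannel('mjpeg', {
    ordered: false,
    maxRetransmits: 0
  });
}

// Receiver-side rule: only display a frame if it is newer than the last
// one shown; late arrivals on the unordered channel are simply dropped.
function shouldDisplay(lastShownSeq, incomingSeq) {
  return incomingSeq > lastShownSeq;
}
```

Tagging each frame with a sequence number also gives you the timing data needed for the end-to-end delay estimate the answer recommends.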

jesup
  • I am always glad when you address WebRTC questions, as we know we are getting exact info :). – Benjamin Trent Jan 09 '15 at 16:03
  • @jesup, with DataChannels I am facing two problems. One: if the peer changes tab, the timeout calls are slower, so video slows down from 10 fps to 1-2 fps. Two: for some reason (unable to debug), after some time the channel (readyState still `open`) just stops transmitting data. The worst part is there is no indication that it stopped (the timeout function still gets called; I have to look at the other peer's screen to figure out that it stopped)... – mido Jan 10 '15 at 03:06
  • @mido22 setTimeout calls in JS always reduce in frequency or stop (or reduce then stop) in all browsers when focus shifts to another tab. This is very intentional. You could try using a form of Worker, but I'm not sure that would work. You do get other events normally, so you may be able to get reverse-direction traffic to drive your captures (imperfect, but may work - have the client send "give me a frame" via setTimeouts, and grab frames when you receive these. Nice side effect: if the client stops watching the tab, the requests slow/stop. – jesup Jan 16 '15 at 07:27
  • yes, even I came to a similar conclusion. I was already using too many web workers (for media-recording purposes), so I tried having the other peer request each frame, but no improvement if the tab is inactive :( – mido Jan 16 '15 at 07:33
  • incoming DataChannel `onmessage`s or WebSocket `onmessage`s should happen without delay, so you should be able to respond to those immediately even if your tab is inactive. If not, please try a Nightly with NSPR_LOG_MODULES=datachannel:5,timestamp NSPR_LOG_FILE=whatever – jesup Jan 20 '15 at 16:47

Found a probable solution, at least for Firefox: capture the canvas's stream and transmit it using `canvas.captureStream()`:

```javascript
// Find the canvas element to capture
var canvasElt = document.getElementsByTagName("canvas")[0];

// Get the stream
var stream = canvasElt.captureStream(25); // 25 FPS

// Do things to the stream
// e.g. send it to another computer using an RTCPeerConnection;
// pc is an RTCPeerConnection created elsewhere
pc.addStream(stream);
```
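On the receiving peer, the captured stream arrives like any other remote stream. A minimal sketch of attaching it to a video element; the helper name is an assumption, and note that `onaddstream` (like `addStream`) is the legacy API of this answer's era, since replaced by `ontrack`/`addTrack`:

```javascript
// When the remote stream arrives, attach it to a <video> element
// so the processed canvas frames play back as live video.
function attachRemoteStream(pc, videoEl) {
  pc.onaddstream = function (event) {
    videoEl.srcObject = event.stream;
  };
}
```

Usage would be `attachRemoteStream(pc, document.querySelector('video'))` before the offer/answer exchange completes.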
mido
  • Hello, I was trying the same method (capturing an image from the local video, converting it to base64, splitting it into chunks, and sending it via a data channel), but as you mentioned, it's slow. Does this new method perform visibly better? I want these images to look like video streaming on the other client. – Cozdemir Oct 02 '19 at 12:39
  • @mido: Hello there, can you please answer this question? https://stackoverflow.com/questions/62602079/windows-feed-a-webrtc-stream-to-a-virtual-driver – user2801184 Jun 26 '20 at 21:17