I want to use a canvas element as the MediaStream source for the video part of a WebRTC communication. Any directions would be helpful; I have scoured the net and haven't found many resources discussing this topic.
**Long Background Story**
The problem: I cannot send the video from the camera directly. It is part of the requirements that I process the video (some image-processing stuff, out of scope for this question) before displaying it.
Previously, on the other peer's browser, instead of directly displaying the video using a `<video>` tag, I did some processing on a hidden canvas element and then copied the result to another canvas (using a setTimeout loop to keep drawing, which gave the illusion of live video).
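Roughly, that receiver-side loop looked like this (a simplified sketch; the element IDs and the `processFrame` helper are placeholders for my actual processing code):

```javascript
// Sketch of the old receiver-side approach:
// "hidden" holds the processed frame, "display" is what the user sees.
const hiddenCanvas = document.getElementById('hidden');   // placeholder id
const displayCanvas = document.getElementById('display'); // placeholder id
const ctx = displayCanvas.getContext('2d');

function drawLoop() {
  // processFrame() stands in for my image-processing step
  processFrame(hiddenCanvas);
  // Copy the processed frame onto the visible canvas
  ctx.drawImage(hiddenCanvas, 0, 0, displayCanvas.width, displayCanvas.height);
  // Redrawing ~30 times per second gives the illusion of live video
  setTimeout(drawLoop, 1000 / 30);
}
drawLoop();
```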
Now the client wants the processing done before transmission of the video, so I use WebRTC to pass only the audio stream directly (previously both audio and video were sent via WebRTC). For the video stream, I tried two solutions:
Steps:

1. Process the video on the local peer and draw it onto a hidden canvas (the easy part).
2. Use a timeout to repeatedly capture the image data and transmit it (see the sketch after this list), either:
   a) using WebSockets (yes, it goes through the server), which came with horrible lag and an eventual crash of the browser; or
   b) using an RTCDataChannel, which performed much better but at times fails for no apparent reason. I also ran into several other issues (e.g. extra bandwidth used, because I was sending JPEG instead of WebP).
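Here is a simplified sketch of solution (b), assuming an already-negotiated `RTCPeerConnection` named `pc`; `hiddenCanvas`, the frame rate, and the JPEG quality are placeholders:

```javascript
// Capture frames with a timer and push them over an RTCDataChannel.
const channel = pc.createDataChannel('video-frames');

function sendFrame() {
  // toDataURL('image/webp') silently falls back to PNG in browsers
  // without WebP encoding support, hence the JPEG bandwidth problem
  const frame = hiddenCanvas.toDataURL('image/jpeg', 0.6);
  if (channel.readyState === 'open') {
    channel.send(frame);
  }
  // For solution (a), the call is socket.send(frame) on a WebSocket instead
  setTimeout(sendFrame, 1000 / 15);
}
channel.onopen = sendFrame;
```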
Another major issue: because I am using setTimeout, the frame rate drops on the other side whenever I switch tabs, since browsers throttle timers in background tabs.
So, is there any way I can use the hidden canvas as a MediaStream source instead of doing all of this manually?
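For reference, this is the kind of thing I am hoping exists (a sketch only; `HTMLCanvasElement.captureStream()` looks like a candidate, but I do not know its support or behavior well, hence this question):

```javascript
// Desired: treat the hidden canvas as a live video source,
// so the browser handles frame capture and encoding for me.
const stream = hiddenCanvas.captureStream(30); // 30 fps
// Attach the canvas-backed video track to the existing peer connection
stream.getVideoTracks().forEach(track => pc.addTrack(track, stream));
```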