
I'm currently sending a canvas stream over WebRTC, using canvas.captureStream(). This works as intended, but my canvas is an overlay over another video and therefore has transparent pixels.
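
For reference, the capture side currently looks roughly like this (simplified sketch, signaling omitted; pc stands in for the already-established RTCPeerConnection and "overlay" is just a placeholder id for my canvas):

// grab the overlay canvas and capture it as a 30 fps video stream
const canvas = document.getElementById("overlay");
const context = canvas.getContext("2d");
const stream = canvas.captureStream(30);

// hand the resulting video track to the existing peer connection
stream.getVideoTracks().forEach(track => pc.addTrack(track, stream));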

I'm aware that the usual H26x or VPx formats don't support transparency (see also Streaming video with transparent pixels using webrtc), so I decided to go with good old chroma-keying (i.e. 100% green == transparent).

To get there, I'm currently filling the canvas with transparent green at startup:

// already tried various composite ops here, none seem to work
// context.globalCompositeOperation = "destination-out";
context.fillStyle = "rgba(0,255,0,0)";
context.fillRect(0, 0, canvas.width, canvas.height);

This looks correct on the client side (the canvas is transparent, whatever I draw on top is not), but in the resulting stream that goes out over WebRTC, the background that's transparent in the browser comes out as plain black rather than green. I would have expected the alpha channel to simply be dropped, leaving the green?

When I change the alpha value to anything other than 0 (e.g. 1 or 128), then the result isn't transparent anymore, but fully opaque bright green, both in the outgoing stream (good) and in the browser (not good).

I'd rather avoid having to manually do an RGBA -> RGB conversion in JavaScript for every frame on a hidden canvas, which is the only alternative I can think of right now. Other ideas very welcome :-)
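
Roughly, that per-frame conversion I'd like to avoid would look something like the following (hypothetical sketch; visibleCtx/hiddenCtx stand for the contexts of the visible canvas and of an equally-sized hidden canvas that would then serve as the stream source):

// read back the visible canvas, replace transparent pixels with opaque
// chroma-key green, and write the result to the hidden canvas
function convertFrame() {
  const frame = visibleCtx.getImageData(0, 0, canvas.width, canvas.height);
  const data = frame.data;
  for (let i = 0; i < data.length; i += 4) {
    if (data[i + 3] === 0) {   // fully transparent pixel -> green
      data[i] = 0;
      data[i + 1] = 255;
      data[i + 2] = 0;
    }
    data[i + 3] = 255;         // force every pixel opaque
  }
  hiddenCtx.putImageData(frame, 0, 0);
  requestAnimationFrame(convertFrame);
}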

EDIT: tested with both Chrome 96 and Firefox 94, on Ubuntu 20.04. For reference, here's the description of the compositing ops: https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API/Tutorial/Compositing/Example

Florian Echtler
  • What kind of drawings will be on the canvas? There will be very different solutions depending on the use case. For instance, if it's "just" a drawing board, then sending the drawing commands instead of a MediaStream would probably be better. If it's a "funny-hat" overlay on a webcam feed, then a "video + mask at the bottom" solution would probably be better. But anyway, yes, setting ctx.fillStyle = "rgba(255,255,255,0)" is exactly the same as setting it to "transparent" or "rgba(0,0,0,0)": alpha is premultiplied, and any fully transparent pixel is transparent black. – Kaiido Dec 09 '21 at 08:41
  • Right now, it's just mouse doodles, i.e. a drawing board. I was also considering sending drawing commands, but there are other non-web clients that will only send video, so the receiver also expects video only (I hope that makes sense). After I saw your other comment here (https://stackoverflow.com/questions/42146396/capture-video-with-alpha-channel-using-canvas-capturestream#comment98754395_42150255), I realized that I won't need to do pixel-wise conversion after all but can just paint the canvas image over a second green canvas and stream that one? – Florian Echtler Dec 09 '21 at 08:51
  • Yes, you could, though depending on the content a chroma key may not render well (particularly with antialiasing). A mask would mean more data being sent but better quality. However, whatever the solution, how would these non-web client receivers consume the MediaStream? Would they be able to post-process the data correctly? FWIW, Chrome does support recording with transparency. – Kaiido Dec 09 '21 at 09:54
  • The non-web clients run the GStreamer WebRTC component in Python and basically just mix/overlay several video streams, so it would be a rather noticeable amount of work to have them parse drawing commands. I figured out an alternative solution that seems relatively efficient; see the posted answer. – Florian Echtler Dec 10 '21 at 07:44

1 Answer


Based on the comments by Kaiido (thanks!), I figured out a relatively (?) efficient solution: I now have two canvases, one visible and one hidden, and all drawing commands are just replicated on both of them. The visible one uses a transparent black background and is overlaid over the video, the hidden one has a bright green background and is used as stream source.
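
A minimal sketch of that setup (element ids, the drawLine example and the 30 fps frame rate are placeholders; the actual drawing code is of course more involved):

// visible canvas: transparent background, overlaid on the video
const visible = document.getElementById("overlay");
const ctxVisible = visible.getContext("2d");

// hidden canvas: same size, bright green background, never added to the DOM
const hidden = document.createElement("canvas");
hidden.width = visible.width;
hidden.height = visible.height;
const ctxHidden = hidden.getContext("2d");
ctxHidden.fillStyle = "rgb(0,255,0)";
ctxHidden.fillRect(0, 0, hidden.width, hidden.height);

// every drawing command is simply replicated on both contexts
function drawLine(x1, y1, x2, y2) {
  for (const ctx of [ctxVisible, ctxHidden]) {
    ctx.beginPath();
    ctx.moveTo(x1, y1);
    ctx.lineTo(x2, y2);
    ctx.stroke();
  }
}

// the outgoing WebRTC track is captured from the hidden canvas
const outgoingStream = hidden.captureStream(30);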

Florian Echtler