
According to MDN:

> The HTMLMediaElement interface adds to HTMLElement the properties and methods needed to support basic media-related capabilities that are common to audio and video.

There is also HTMLMediaElement.captureStream() (and its counterpart, HTMLCanvasElement.captureStream()): both <video> and <canvas> elements can capture their content as a MediaStream.

Conversely, one can set a video stream as the srcObject of a <video> element, and the element will then display it. Is the same possible for a <canvas> element?
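For instance, this is what I mean (a minimal sketch; `stream` here stands for any MediaStream, e.g. one obtained from getUserMedia):

// A MediaStream can be shown by a <video> element like this:
const video = document.querySelector("video");
video.srcObject = stream; // `stream` is any MediaStream
video.play();

// ...and a <canvas> can *produce* a stream:
const canvasStream = document.querySelector("canvas").captureStream();

// But is there an equivalent of `srcObject` for consuming one on a <canvas>?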

Is it possible to add a stream as a source to an HTML <canvas> element?

sçuçu
  • Possibly a repeated usage of `ctx.drawImage(video, ...)` does this, but is there a stream-based abstraction that mirrors the HTML `video` element's `captureStream` method? – sçuçu May 11 '19 at 23:15

2 Answers


No, there is nothing in any of the canvas APIs able to consume a MediaStream.

The canvas APIs work only with raw pixels and contain no decoder of any sort. You must use either JavaScript objects that are able to do this decoding (e.g. ImageBitmap) or HTMLElements.

So in the case of a MediaStream, currently the only object able to decode its video content is an HTMLVideoElement, which you'll then be able to draw on your canvas easily.
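A minimal sketch of that route (assuming `stream` is your MediaStream; the element lookups and variable names are illustrative, and autoplay policies may require the video to be muted or started from a user gesture):

// Decode the MediaStream with a (possibly hidden) <video> element...
const video = document.createElement("video");
video.srcObject = stream; // `stream` is your MediaStream
video.muted = true;
video.play();

// ...then copy its frames onto the canvas in a loop.
const canvas = document.querySelector("canvas");
const ctx = canvas.getContext("2d");
video.addEventListener("loadedmetadata", () => {
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  (function draw() {
    ctx.drawImage(video, 0, 0);
    requestAnimationFrame(draw);
  })();
});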


**2021 update**

The WebCodecs API has made great progress recently and has become mature enough that it's now worth mentioning as a solution.

This API offers a new interface called VideoFrame, which will soon be part of the CanvasImageSource type, meaning we'll be able to use it directly with drawImage, texImage2D, and everywhere else a CanvasImageSource can be used.
The MediaCapture Transform W3C group has developed a MediaStreamTrackProcessor that returns such VideoFrames from a video MediaStreamTrack.

So we now have a more direct way to render a MediaStream to a canvas, though it currently works only in Chrome with the #enable-experimental-web-platform-features flag turned on...

if( window.MediaStreamTrackProcessor ) {
  const canvas = document.querySelector("canvas");
  const ctx = canvas.getContext("2d");
  const track = getCanvasTrack(); // MediaStream.getVideoTracks()[0]
  const processor = new MediaStreamTrackProcessor( track );
  const reader = processor.readable.getReader();
  readChunk();
  function readChunk() {
    reader.read().then( ({ done, value }) => {
      if( done ) { // the stream has ended: `value` is undefined, nothing to draw
        return;
      }
      // the MediaStream video can have dynamic size
      if( canvas.width !== value.displayWidth || canvas.height !== value.displayHeight ) {
        canvas.width = value.displayWidth;
        canvas.height = value.displayHeight;
      }
      ctx.clearRect( 0, 0, canvas.width, canvas.height );
      // value is a VideoFrame
      ctx.drawImage( value, 0, 0 );
      value.close(); // close the VideoFrame when we're done with it
      readChunk();
    });
  }
}
else {
  console.error("Your browser doesn't support this API yet");
}

// We can't use getUserMedia in StackSnippets
// So here we use a simple canvas as source
// for our MediaStream.
function getCanvasTrack() {
  // just some noise...
  const canvas = document.createElement("canvas"); // default size is 300x150
  const ctx = canvas.getContext("2d");
  const img = new ImageData(300, 150); // same size as the canvas
  const data = new Uint32Array(img.data.buffer); // one 32-bit entry per pixel
  const track = canvas.captureStream().getVideoTracks()[0];

  anim();
  
  return track;
  
  function anim() {
    for( let i = 0; i < data.length; i++ ) {
      // random color with full alpha (0xAABBGGRR on little-endian)
      data[i] = Math.random() * 0xFFFFFF + 0xFF000000;
    }
    ctx.putImageData(img, 0, 0);
    if( track.readyState === "live" ) {
      requestAnimationFrame(anim);
    }
  }
  
}
<canvas></canvas>

And here it is as a Glitch project (source), using the camera as the source.
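For reference, swapping the noise canvas for the camera would look roughly like this (a sketch only; getUserMedia needs camera permission and a secure context, and `getCameraTrack` is a hypothetical name):

// Hypothetical camera-based replacement for getCanvasTrack():
function getCameraTrack() {
  return navigator.mediaDevices.getUserMedia({ video: true })
    .then( (stream) => stream.getVideoTracks()[0] );
}
// Note: this returns a Promise of a track, so the caller must await it
// before constructing the MediaStreamTrackProcessor.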

Kaiido
  • Do you mean that I must use either JavaScript objects or HTMLElements that can decode from streams (from a MediaStream)? – sçuçu May 13 '19 at 18:21
  • Yes, you must use an HTMLVideoElement to decode the video component of a MediaStream on the front-end. Only from this will you be able to use one of the canvas context's methods to draw it (e.g. drawImage for the 2d context). – Kaiido May 13 '19 at 23:35
  • Another related post of mine was about *showing both the camera stream and canvas 2d- or webgl-drawn things in a `video` or `canvas` element, using `MediaStream`*. It turned out to be the wrong approach. Related to that: is it possible to use the `canvas` context's `drawImage()` to put the camera video on the `canvas`, overlaying it with other virtual objects created with canvas 2d or webgl? – sçuçu May 14 '19 at 06:43
  • Yes, just check [the link](https://stackoverflow.com/questions/4429440/html5-display-video-inside-canvas) I posted in my answer. – Kaiido May 14 '19 at 06:50
  • I couldn't find your username there; is it different there, and is it the accepted answer? – sçuçu May 14 '19 at 06:58
  • No, I didn't answer there, but the answers that have been posted all show precisely how to draw a video on a canvas. That it comes from a MediaStream is no different: you need to get it playing, then use ctx.drawImage(video, x, y). – Kaiido May 14 '19 at 07:01
  • Ah sorry, I misunderstood your comment. I saw the answer. Thanks, now I understand. I hope the same thing is possible when the canvas webgl context is used as well, though the webgl context's API will be used wherever possible. I just need it to be possible; I will figure it out, and if I cannot, I will search and then ask again. – sçuçu May 14 '19 at 07:13
  • Yes, you can also very well pass an HTMLVideoElement as a texture in webgl. There are many examples out there. – Kaiido May 14 '19 at 07:26
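(For reference, a minimal sketch of what Kaiido describes here: uploading the current frame of a playing video element as a WebGL texture. The names are illustrative, and the upload has to be repeated each frame:)

const gl = canvas.getContext("webgl");
const texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
// Video dimensions are rarely powers of two, so avoid mipmaps and repeat wrapping.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
// Call this once per frame while the video is playing:
function uploadFrame(video) {
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
}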

@Kaiido is correct in that there isn't any way to do this directly. So, here's what you must do:

// Assumes `video` is a <video> element that is already playing your
// MediaStream (i.e., video.srcObject = stream; video.play();).
const canvas = document.querySelector("canvas");
const canvasContext = canvas.getContext("2d");

function onFrame() {
  window.requestAnimationFrame(onFrame);
  canvasContext.drawImage(video, 0, 0);
}
onFrame();

A couple of gotchas you're going to run into:

  • Your source video can change resolution mid-stream. This is very common in WebRTC calls, where the source may scale the actual pixel resolution due to bandwidth or CPU constraints. One way around this is to check the size of the video on every frame you draw and scale your canvas accordingly (see the sketch after this list).
  • This frame loop doesn't run at speed when the tab doesn't have focus. If you're relying on captureStream from this canvas as well, due to throttling policies, it isn't going to work if the tab doesn't have focus.
  • The canvas buffer doesn't update when the tab doesn't have focus, so even if you hack around the timer issue with an audio script node or something, it won't work if you want to use captureStream from the canvas as well.
  • Remember that there is no "genlock" here. For every frame you copy to the canvas, an arbitrary number of frames (possibly zero!) could have passed by on the video. This might not matter for your situation.
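A minimal sketch of the per-frame size check from the first gotcha, reusing the names from the snippet above (`videoWidth`/`videoHeight` reflect the source's current intrinsic resolution):

function onFrame() {
  window.requestAnimationFrame(onFrame);
  // Match the canvas to the source's current resolution before drawing.
  if (canvas.width !== video.videoWidth || canvas.height !== video.videoHeight) {
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
  }
  canvasContext.drawImage(video, 0, 0);
}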
Brad
  • Good point about the fact that the source resolution may change; however, there is an `onresize` event that you can listen for on the consumer instead of checking every frame: https://jsfiddle.net/df83jbyx/ Also, even though these are good points, I don't think the OP will use `captureStream`; if I got it correctly, they only talked about it to state that the MediaElement and CanvasElement APIs have relations with MediaStream, and thus hoped they would be able to consume it directly. – Kaiido May 13 '19 at 00:02
  • Agreed, good gotchas. For the first one, about video resolution changes (usually in WebRTC), instead of checking or setting the resolution repeatedly on every draw, the `negotiationneeded` event on the WebRTC peer connection might be better. For other cases involving other objects, I would look for whether the video element or stream track has an event for resolution or other changes, to make the whole thing independent of the source-related objects, like @Kaiido's suggestion of the `resize` event. – sçuçu Nov 29 '21 at 09:43
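(A minimal sketch of the event-driven approach discussed in these comments, reusing the `video`/`canvas` names from the answer above; an HTMLVideoElement fires a `resize` event when its intrinsic dimensions change:)

video.addEventListener("resize", () => {
  // videoWidth/videoHeight have changed: resize the canvas once, here,
  // instead of checking on every drawn frame.
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
});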