
I am getting frames from a canvas through canvas.toDataURL().

However, now I have an array of png images, but I want a video file.

How do I do this?

var canvas = document.getElementById("mycanvaselementforvideocapturing");
var pngimages = [];
...
setInterval(function(){pngimages.push(canvas.toDataURL())}, 1000);
David Callanan

2 Answers


For full browser support, you'll have to send your image batch to the server and use some server-side program to do the encoding.

FFmpeg might be able to do it.

But in the newest browsers, the canvas.captureStream method has been implemented. It converts your canvas drawings to a webm video stream, recordable with a MediaRecorder. All of this is still not stabilized, though, and is only available in the latest versions of browsers, probably with some flags set in the user's preferences (e.g. Chrome needs the "Experimental Web Platform features" flag).

var cStream,
  recorder,
  chunks = [];

rec.onclick = function() {
  this.textContent = 'stop recording';
  // set the framerate to 30FPS
  cStream = canvas.captureStream(30);
  // create a recorder fed with our canvas' stream
  recorder = new MediaRecorder(cStream);
  // start it
  recorder.start();
  // save the chunks
  recorder.ondataavailable = saveChunks;

  recorder.onstop = exportStream;
  // change our button's function
  this.onclick = stopRecording;
};

function saveChunks(e) {

  chunks.push(e.data);

}

function stopRecording() {

  recorder.stop();

}


function exportStream(e) {
  // combine all our chunks into one blob
  var blob = new Blob(chunks, { type: 'video/webm' });
  // do something with this blob
  var vidURL = URL.createObjectURL(blob);
  var vid = document.createElement('video');
  vid.controls = true;
  vid.src = vidURL;
  vid.onended = function() {
    URL.revokeObjectURL(vidURL);
  }
  document.body.insertBefore(vid, canvas);
}

// make something move on the canvas
var x = 0;
var ctx = canvas.getContext('2d');

var anim = function() {
  x = (x + 2) % (canvas.width + 100);
  // there is no transparency in webm,
  // so we need to set a background otherwise every transparent pixel will become opaque black
  ctx.fillStyle = 'ivory';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = 'black';
  ctx.fillRect(x - 50, 20, 50, 50);
  requestAnimationFrame(anim);
};
anim();
<canvas id="canvas" width="500" height="200"></canvas>
<button id="rec">record</button>

And since you asked for a way to add audio to this video: you can use cStream.addTrack(anAudioStream.getAudioTracks()[0]); before calling new MediaRecorder(cStream), but this will currently only work in Chrome. FF seems to have a bug in MediaRecorder which makes it record only the stream with the tracks it was defined with... A workaround for FF is to call new MediaStream([videoTrack, audioTrack]);
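The Firefox workaround can be sketched as follows (a minimal sketch: `anAudioStream` is assumed to come from e.g. getUserMedia, and the helper name is mine; the MediaStream/MediaRecorder calls only run in a browser, which is why they are shown commented out):

```javascript
// Collect one video track from the canvas stream and one audio track from
// another stream, so both can be fed to a single MediaRecorder.
function collectTracks(videoStream, audioStream) {
  return [
    videoStream.getVideoTracks()[0],
    audioStream.getAudioTracks()[0],
  ];
}

// Chrome: add the audio track directly to the canvas stream.
//   cStream.addTrack(anAudioStream.getAudioTracks()[0]);
//   recorder = new MediaRecorder(cStream);
// Firefox workaround: build a brand-new MediaStream from both tracks.
//   recorder = new MediaRecorder(
//     new MediaStream(collectTracks(cStream, anAudioStream)));
```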

[big thanks to @jib for letting me know how to actually use it...]


Kaiido
  • Thanks. I must test it on firefox (as chrome is not supporting it). What does the 30 mean in `canvas.captureStream`? Is that the framerate? – David Callanan Aug 13 '16 at 09:02
  • Yes it is, just like the comment says ;-) – Kaiido Aug 13 '16 at 09:04
  • Oh, oops! Missed that. Thanks a million. – David Callanan Aug 13 '16 at 09:05
  • Just wondering.. Does this support audio, if audio is playing on the webpage. – David Callanan Aug 13 '16 at 16:23
  • @DavidCallanan, I'm sorry, actually chrome does support recording the canvas stream too. It's just that both browsers' implementations are so different I thought it didn't, and I might have got fooled by MDN's article about it... Anyway, added a way of saving in both browsers, and a small note about recording audio, which currently won't be possible for your use case... – Kaiido Aug 15 '16 at 13:42
  • To get chunks in Firefox, call [`recorder.requestData()`](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder/requestData) to trigger `ondataavailable`. – jib Aug 19 '16 at 01:46
  • @jib, actually, reading your link, I'm not sure it would help a lot, since my horrible workaround to handle both implementations doesn't really need the chunks; it's just that chrome fires ondataavailable automatically after some short time, with the chunks in it. So `requestData` would allow producing the same behavior only by calling it manually in a timed loop, right? But then, won't I get incorrect data for chrome? I hope all this will finally stabilize... – Kaiido Aug 19 '16 at 02:23
  • 1
    @Kaiido Your workaround is unnecessary. Instead of checking state in `ondataavailable`, use `recorder.onstop` instead, and it'll [work the same in both browsers](https://jsfiddle.net/tmwLxjLy/) regardless of the number of chunks. – jib Aug 19 '16 at 03:11
  • Also, this is a [w3c spec](https://w3c.github.io/mediacapture-record/MediaRecorder.html#dom-mediarecorder-requestdata), not whatwg. – jib Aug 19 '16 at 03:18
  • @jib ah that makes sense ;-) will update accordingly. The MDN article I point to in my answer is really misleading then. Also, do you know for the FF bug I mention about stream.addTrack and MediaRecorder taking only the initial tracks? Or should I open a bug report? – Kaiido Aug 19 '16 at 03:21
  • @Kaiido Yes, please [file a bug](https://bugzilla.mozilla.org/enter_bug.cgi?product=Core&component=Audio%2FVideo%3A%20Recording). Did you try `new MediaStream([canvasTrack, audioTrack])`? – jib Aug 19 '16 at 03:38
  • @jib "TypeError: Argument 1 is not valid for any of the 2-argument overloads of MediaRecorder.". Tried with both streams and with both tracks in an array, in a Blob containing both tracks/streams. Anyway, that would still be a bug that it doesn't record the manually assigned tracks. – Kaiido Aug 19 '16 at 03:55
  • 2
    @Kaiido I've updated the [MDN article](https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder_API/Using_the_MediaRecorder_API). Thanks for pointing out it was outdated! Yes please file a bug. – jib Aug 19 '16 at 05:08
  • @jib, not sure it needs an SO question either, since the only response I would get would be "file a bug" and that it's now done : https://bugzilla.mozilla.org/show_bug.cgi?id=1296531 ps: OP asked for audio in comments, and anyway, chrome doesn't work either. – Kaiido Aug 19 '16 at 05:30

The MediaRecorder + canvas.captureStream approach in Kaiido's answer is definitely the way to go at the moment - unfortunately as of writing it's only supported in Chrome and Firefox.

Another approach, which will work once browsers adopt webp encoding support (currently only Chrome has it), is this:

let frames = []; // <-- frames must be *webp* dataURLs
let webmEncoder = new Whammy.Video(fps);
frames.forEach(f => webmEncoder.add(f));
// compile is callback-based, so wrap it in a Promise (needs an async context)
let blob = await new Promise(resolve => webmEncoder.compile(false, resolve));
let videoBlobUrl = URL.createObjectURL(blob);

It uses the whammy library to join a bunch of webp images into a webm video. In browsers that support webp encoding you can write canvas.toDataURL("image/webp") to get a webp dataURL from a canvas. This is the relevant bug report for Firefox webp support.

One cross-browser approach as of writing seems to be to use libwebp.js to convert the png dataURLs output by canvas.toDataURL() into webp images, and then feed those into the whammy encoder to get your final webm video. Unfortunately the png-to-webp encoding process is very slow (several minutes for a few seconds of video on my laptop).

Edit: I've discovered that with the MediaRecorder/captureStream approach, you can end up with low-quality output videos compared to the whammy approach. So unless there's some way to control the quality of the captured stream, the whammy approach seems like the best option, with the other two as fall-backs. See this question for more details. Use this snippet to detect whether a browser supports webp encoding (and thus supports the whammy approach):

let webPEncodingIsSupported = document.createElement('canvas').toDataURL('image/webp').startsWith('data:image/webp');
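Putting the detection together, the fallback order described above could be sketched like this (the function and the approach labels are mine, for illustration only):

```javascript
// Pick an encoding approach following the order argued above: whammy when the
// browser can encode webp, captureStream/MediaRecorder as a fallback, and
// server-side encoding as the last resort.
function pickApproach(webPEncodingIsSupported, hasCaptureStream) {
  if (webPEncodingIsSupported) return 'whammy';
  if (hasCaptureStream) return 'mediarecorder';
  return 'server-side';
}

// In a browser you would feed it real feature checks, e.g.:
//   let webp = document.createElement('canvas')
//     .toDataURL('image/webp').startsWith('data:image/webp');
//   let capture = 'captureStream' in HTMLCanvasElement.prototype;
//   let approach = pickApproach(webp, capture);
```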