
I am streaming video over a WebSocket by sending each frame in the raw ImageData format (4 bytes per pixel in RGBA order). When I receive each frame on the client (as an ArrayBuffer), I want to paint this image directly onto the canvas as efficiently as possible, using putImageData.

This is my current solution:

// buffer is an ArrayBuffer representing a properly-formatted image
// (here "canvas" is the CanvasRenderingContext2D; putImageData lives on the context)
var array = new Uint8ClampedArray(buffer);
var image = new ImageData(array, width, height);
canvas.putImageData(image, 0, 0);

But it is rather slow. My theories as to why:

  • the array (which is ~1 MB) is being copied three times each frame (30 times per second): once into the Uint8ClampedArray, once into the ImageData, and finally into the canvas.

  • I am using new twice for each frame, which may be a problem for the garbage collector.

Are these theories correct, and if so, what tricks can I employ to make this as fast as possible? I am willing to accept an answer that is browser-specific.
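For concreteness, here is the kind of allocation-reuse pattern I have been considering (untested sketch; the socket setup and the width/height values are assumed, and "canvas" is the 2D context as above):

// Hypothetical sketch: allocate the ImageData once instead of once per frame
var image = canvas.createImageData(width, height);

socket.binaryType = 'arraybuffer';
socket.onmessage = function (event) {
  // Wrapping the ArrayBuffer in a view is cheap; .set() performs the copy
  image.data.set(new Uint8ClampedArray(event.data));
  canvas.putImageData(image, 0, 0);
};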

rvighne

1 Answer


No: both your ImageData (image) and your typed array (array) share the exact same ArrayBuffer (buffer).

These objects are just views (pointers) over that buffer; the constructors never "copy" your original data.

var ctx = document.createElement('canvas').getContext('2d');

// Grab a correctly-sized buffer from the canvas itself
var buffer = ctx.getImageData(0, 0, ctx.canvas.width, ctx.canvas.height).data.buffer;
// Wrap it in a typed-array view...
var array = new Uint8ClampedArray(buffer);
// ...and hand that view to an ImageData
var image = new ImageData(array, ctx.canvas.width, ctx.canvas.height);
// All three objects point at the very same ArrayBuffer
console.log(array.buffer === buffer && image.data.buffer === buffer);

As for your processing-time issue, the best approach would be to send the video stream directly to a video element and paint it onto the canvas with drawImage.
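A minimal sketch of that approach (assuming the stream is already playing in a video element; the element IDs are hypothetical):

var video = document.getElementById('stream'); // hypothetical <video> playing the stream
var ctx = document.getElementById('out').getContext('2d');

function paint() {
  ctx.drawImage(video, 0, 0, ctx.canvas.width, ctx.canvas.height);
  requestAnimationFrame(paint);
}
video.addEventListener('playing', function () {
  requestAnimationFrame(paint);
});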

Kaiido
  • Thanks, your example makes a lot of sense. Unfortunately I can't use a video element since it has to be realtime and HTML5 doesn't technically support streaming – rvighne Sep 21 '16 at 04:05
  • @rvighne Actually it is part of the specs. You should be able to get a MediaStream from a videoElement using the `videoElement.captureStream()` method, currently prefixed in FF as `mozCaptureStream`. Then you should be able to send it through WebRTC, socket.io, or a WebSocket. Finally, you just have to set the client-side videoElement's `srcObject` to the MediaStream you sent. I don't have the server-side knowledge, but here is a front-end demo converting a recorded file to a video stream (see the first sketch after these comments): https://jsfiddle.net/usk05sfs/ – Kaiido Sep 21 '16 at 04:35
  • And if your stream only comes from the server, I think you can use the MediaSource API and send chunks of the file (see the second sketch below). https://developer.mozilla.org/en-US/docs/Web/API/MediaSource – Kaiido Sep 21 '16 at 04:41
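A minimal sketch of the `captureStream()` route from the first comment above (untested; the element IDs, the `receivedStream` name, and the transport are placeholders, and the method is prefixed in Firefox as `mozCaptureStream`):

// Sender side: capture a MediaStream from a playing <video> (hypothetical #source element)
var sourceVideo = document.getElementById('source');
var stream = sourceVideo.captureStream ? sourceVideo.captureStream()
                                       : sourceVideo.mozCaptureStream(); // Firefox prefix
// ...hand `stream` to the transport, e.g. an RTCPeerConnection...

// Receiver side: attach whatever MediaStream the transport delivers (placeholder name)
var remoteVideo = document.getElementById('remote');
remoteVideo.srcObject = receivedStream;
remoteVideo.play();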
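And a minimal sketch of the MediaSource route from the second comment (untested; the MIME/codec string and the WebSocket framing are assumptions, and the chunks must be in a container format the browser can append, e.g. fragmented WebM):

var video = document.getElementById('out'); // hypothetical element
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  // The codec string must match what the server actually sends
  var sb = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  var queue = []; // appendBuffer throws while an append is in flight, so queue chunks
  sb.addEventListener('updateend', function () {
    if (queue.length) sb.appendBuffer(queue.shift());
  });
  socket.binaryType = 'arraybuffer';
  socket.onmessage = function (event) {
    if (sb.updating || queue.length) queue.push(event.data);
    else sb.appendBuffer(event.data);
  };
});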