
I am using websockify to display images from a Python server on an HTML5 canvas.

I think I have managed to successfully send images from my Python server, but I am not able to display them on my canvas.

I think the problem has to do with the number of bytes I am trying to display on the canvas: I believe I am not waiting until the whole image has been received before drawing it.

So far I have:

The onmessage handler. When I send an image I get MESSAGERECEIVED 12 times in the console:

    ws.on('message', function () {
        //console.log("MESSAGERECEIVED!");
        msg(ws.rQshiftStr());
    });

The msg function, where I receive the string and try to display it on the canvas. The method is invoked 12 times for each picture. The format of the string is 'xÙõKþ°pãüCY':

    function msg(str) {
        //console.log(str);
        console.log("RELOAD");

        var ctx = cv.getContext('2d');
        var img = new Image();
        img.src = "data:image/png;base64," + str;
        img.onload = function () {
            ctx.drawImage(img, 0, 0);
        };
    }

Any suggestions on how to fix this?

glarkou
  • You shouldn't transmit images as base64 over websockets unless the client doesn't support the hybi drafts. Base64 will incur a 33% overhead. Sending as binary is therefore preferable. – einaros Feb 15 '12 at 11:13
  • I think `websockify` is converting everything transferred through it to base64. – glarkou Feb 15 '12 at 11:15
  • If that's the case, that's terrible. Look for another websocket implementation. – einaros Feb 15 '12 at 12:18
  • @einaros, I address the base64 encode/decode more fully in my answer below, but in summary, the focus of websockify is to transparently enable binary data across most browsers in the wild (not just modern ones). BTW, good work on http://github.com/einaros/ws. I plan on updating my node implementation of websockify (the primary is python but I have several other example implementations) to use your lib at some point in the near future. – kanaka Feb 15 '12 at 14:50

1 Answer


The focus of websockify+websock.js is to transparently support streaming binary data (more on that below). The data you get off the receive queue is already base64 decoded. However, the data URI scheme expects a base64 encoded string, so you need to encode the image data to base64. You can use the built-in window.btoa() to base64 encode a binary-coded string:

img.src = "data:image/png;base64," + window.btoa(str);

Or, for greater efficiency, you can use the Base64 module from include/base64.js, but you will need to pass it an array of bytes as you would get from rQshiftBytes:

msg(ws.rQshiftBytes());

img.src = "data:image/png;base64," + Base64.encode(data);  // str -> data

Regarding the use of base64 in websockify:

Websockify uses base64 to encode the data in order to support browsers which don't support binary data directly. In addition to popular Hixie-only browsers such as iOS Safari and desktop Safari, some browser versions in the wild use HyBi but are missing binary support. And unfortunately, in the case of Chrome, they also had a WebIDL bug around that same time which prevents detecting binary support before making a connection.

Also, the main option for using WebSockets on Opera, Firefox 3.6-5, and IE 8 and 9 is web-socket-js. web-socket-js supports HyBi but does not have binary support and probably won't, because most of the older browsers it targets don't support native binary types (Blob and Typed Arrays).

The market share of browsers which support HyBi and binary data is currently pretty low. However, in the future, Websockify will detect (either by object detection or browser version detection) whether the browser supports binary data and use that support directly without the need for base64 encode/decode.
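
As a rough illustration of the object-detection route (and of the connection-based check described in the comments below), something along these lines could be used; the port number and function name here are illustrative assumptions, not websockify API:

    function detectBinarySupport() {
        if (!window.WebSocket) { return false; }
        try {
            // Some engines only expose binaryType on a live instance, so open a
            // throwaway connection to an arbitrary high localhost port and inspect it.
            var test = new WebSocket("ws://localhost:17523");
            var supported = (typeof test.binaryType !== "undefined") &&
                            !!(window.ArrayBuffer || window.Blob);
            test.close();
            return supported;
        } catch (e) {
            return false;
        }
    }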

The latency (and CPU) overhead of base64 encode/decode is pretty low and usually washed out by network latencies anyhow. The bandwidth overhead of base64 is about 25% of what goes over the wire: every 3 raw bytes become 4 encoded characters, so raw data becomes 33% larger. In exchange it gives you binary data over WebSockets on basically all browsers in the wild (with the web-socket-js polyfill/shim, which websockify uses transparently if needed).

kanaka
  • Thanks for the detailed answer once again. Can you please provide an example of how I can determine whether an image has been fully transferred? More specifically, in Python I send the data using a regular socket with a buffer size of 1024. If I have an image which is bigger than the buffer, it will be sent in chunks of 1024 bytes. In the browser this will trigger the WebSocket message event 12 times (if I have an image of 12 KB). How can I combine all of these messages to rebuild the image? Thanks once again. – glarkou Feb 15 '12 at 15:52
  • WebSocketServer.send_frames(bufs) takes a list of "buffers". Each buffer will be delivered to the client (browser) as a single message (one firing of onmessage). If you shift everything off the receive queue in onmessage then you will get whole messages. If you are calling send_frames with smaller chunks (or bypassing send_frames entirely), then you will need to add to your protocol some mechanism for reframing messages (i.e. adding a length prefix before each message/image). But this is duplicating WebSocket functionality, so I suggest using send_frames with whole messages/images. – kanaka Feb 15 '12 at 16:11
  • Unfortunately, I cannot understand how to do that. At the moment I have a python server sending images on `Server 1` and I have `websockify` running on `Server 2`. When I receive the messages I tried to do `var arr = ws.rQshiftBytes(ws.rQlen()), str="", chr; while (arr.length > 0) { chr = arr.shift(); str += String.fromCharCode(chr); } //console.log(str); msg(str);`. Unfortunately, `ws.on('message', function ()` is called for every buffer sent from the server, so I am not able to determine when a picture is complete in order to draw it on the canvas. – glarkou Feb 15 '12 at 16:23
  • Ah, I thought you were incorporating websocket.py from websockify as part of your python server. When you run websockify as a WebSocket-to-TCP bridge you are stuck with streaming/fragmentation (just the nature of TCP), so you must do the frame/message handling yourself at both endpoints. You can either combine websocket.py into your python server and use send_frames directly, or add framing to your 'protocol'. If the only thing you are sending is PNG images, you could parse the length out of the image data. Otherwise, you'll have to add your own framing on the python server (a client-side sketch of that reassembly follows these comments). – kanaka Feb 15 '12 at 18:03
  • Since this is becoming a conversation, perhaps you would consider filing an issue with websockify to continue this there? – kanaka Feb 15 '12 at 18:03
  • For those interested in following the conversation: https://github.com/kanaka/websockify/issues/30 – kanaka Feb 15 '12 at 18:57
  • 'However, in the future, Websockify will detect (either by object detection or browser version detection) whether the browser supports binary data' - is this already built in? What is the current state of browser/JavaScript support for binary frames over WebSocket? – Zsolt Mar 07 '13 at 10:19
  • The latest version of every browser supports binary data via WebSockets. The web-socket-js fallback does not support binary data (e.g. for IE9). Unfortunately my pleas to make binary data easily object detectable didn't bear fruit. So in order to detect it you have to actually create a WebSocket connection to some high port on localhost and then test the resulting object. – kanaka Mar 07 '13 at 13:33
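
For completeness, here is a rough sketch of the client-side reassembly discussed in the comments above, assuming the Python server prepends a 4-byte big-endian length before each image; the prefix format and variable names are illustrative assumptions, not part of websockify:

    var pending = [];        // bytes collected so far
    var expected = null;     // payload length from the current prefix, once known

    ws.on('message', function () {
        // Drain whatever is currently in the receive queue.
        var chunk = ws.rQshiftBytes(ws.rQlen());
        pending = pending.concat(Array.prototype.slice.call(chunk));

        while (true) {
            if (expected === null) {
                if (pending.length < 4) { return; }             // prefix incomplete
                expected = (pending[0] << 24) | (pending[1] << 16) |
                           (pending[2] << 8)  |  pending[3];
                pending = pending.slice(4);
            }
            if (pending.length < expected) { return; }          // image incomplete
            var imageBytes = pending.slice(0, expected);
            pending = pending.slice(expected);
            expected = null;
            msg(imageBytes);                                    // draw one whole image
        }
    });

On the Python side the matching step would be to send something like struct.pack('>I', len(png_bytes)) followed by the image bytes; but as suggested above, incorporating websocket.py into the server and calling send_frames with whole images avoids the extra framing entirely.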