
I'm trying to upload files from a JavaScript web page to a Python-based server over WebSockets.

In the JS, this is how I'm packaging the data to send over the WebSocket:

var json = JSON.stringify({
    'name': name,
    'iData': image
});

On the Python side, I'm decoding it like this:

noJson = json.loads(message)
fName = noJson["name"]
fData = noJson["iData"]

I know fData is a Unicode string, but the problems begin when I try to save the file locally. Say I'm trying to upload/save a JPG file. Looking at that file after upload, I see at the beginning:

ÿØÿà^@^PJFIF

The original file should begin with:

<FF><D8><FF><E0>^@^PJFIF

So how do I get it to save the raw byte values instead of the interpreted Unicode characters? This is my current save code:

fd = codecs.open( fName, encoding='utf-8', mode='wb' ) ## On Unix, so the 'b' might be ignored
fd.write( fData)
fd.close()

(if I don't use the "encoding=" bit, it throws a UnicodeDecodeError exception)
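
To illustrate (assuming Python 2, since fData is a unicode string), here's a small illustrative snippet with a made-up `header` variable: encoding the first few characters with UTF-8 turns each one into two bytes, which matches the "ÿØÿà" I see at the start of the saved file.

header = u'\xff\xd8\xff\xe0'            # illustration: raw byte values held as code points U+00..U+FF
print repr(header.encode('utf-8'))      # '\xc3\xbf\xc3\x98\xc3\xbf\xc3\xa0' -- two bytes per character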

  • I'm not a javascript expert but I think you have to encode the binary with something like base64 to upload it. – tdelaney Nov 19 '15 at 03:12
  • Have a look at the discussion here: http://stackoverflow.com/questions/1443158/binary-data-in-json-string-something-better-than-base64 – roeland Nov 19 '15 at 03:48

1 Answer


Use 'latin-1' encoding to save the file.

The fData that you are getting already holds the raw byte values as code points, i.e. you get the string u'\xff\xd8\xff\xe0^@^PJFIF'. The latin-1 codec maps every code point between U+0000 and U+00FF to a single byte with the same value, and fails on any code point above U+FF.
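
A minimal sketch of the write, assuming Python 2 and keeping your codecs.open approach (only the encoding changes):

fd = codecs.open( fName, encoding='latin-1', mode='wb' )
fd.write( fData )   # each code point U+00..U+FF is written back as a single byte
fd.close()

Equivalently, you could skip codecs and do open(fName, 'wb').write(fData.encode('latin-1')).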

memoselyk
  • This is what worked for me. Thanks. In my Python, I updated the file-writing lines to this: `fd = codecs.open( fullName, encoding='latin-1', mode="wb" ) fd.write( fData )` and without changing the JavaScript, files were saved correctly. Tried a few different images as well as a video, all with success. Thanks again. – snowwolf75 Nov 19 '15 at 23:41