I'm trying to read the bytes of a remote binary file, which (of course) are supposed to be in the range 0..255. Since the response comes back as a string, I use charCodeAt to get the numeric value of each character. The problem I've run into is that charCodeAt returns the UTF-8 value (if I'm not mistaken), so for example the byte value 139 comes back as 8249. This breaks my whole application, because I need the values exactly as the server sends them.
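Here is a simplified sketch of what I'm doing (the URL is just a placeholder, and I've left out the rest of the request handling):

```javascript
// Fetch the binary file and walk the response string character by character.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/remote/file.bin'); // placeholder URL
xhr.onload = function () {
    var text = xhr.responseText;
    var bytes = [];
    for (var i = 0; i < text.length; i++) {
        // I expect values 0..255 here, but e.g. the byte 139 shows up as 8249.
        bytes.push(text.charCodeAt(i));
    }
    // ...process bytes...
};
xhr.send();
```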
The immediate workaround would be a big switch that maps each converted code back to the corresponding byte (sketched below), but I was wondering if there is a simpler, more elegant solution. Thanks in advance.
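Something along these lines is what I'd rather avoid writing by hand (only the example value from above is shown; the real thing would need one case per affected byte):

```javascript
// Map a converted code point back to the byte value the server sent.
function toByte(code) {
    switch (code) {
        case 8249: return 139; // the example from above
        // ...one case per remapped byte...
        default:   return code;
    }
}
```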