I have data stored in an SQLite database as `BINARY(16)`, the value of which is determined by PHP's `hex2bin` function on a 32-character hexadecimal string. As an example, the string `434e405b823445c09cb6c359fb1b7918` returns `CN@[4EÀ¶ÃYûy`.
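To make the example concrete, these are the raw byte values that hex string encodes (several fall outside the printable range, which is why the string renders oddly); the `toByteList` helper here is just for illustration:

```js
// For illustration only: list the 16 byte values encoded by the hex string
function toByteList(hex)
{
    var i, bytes = []
    for (i = 0; i < hex.length; i += 2)
    {
        bytes.push(parseInt(hex.substr(i, 2), 16))
    }
    return bytes
}
// [67, 78, 64, 91, 130, 52, 69, 192, 156, 182, 195, 89, 251, 27, 121, 24]
toByteList('434e405b823445c09cb6c359fb1b7918')
```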
The data stored in this database needs to be manipulated by JavaScript, and to do so I've used the following function (adapted from Andris's answer here):
```js
// Convert a hexadecimal string to a binary string
String.prototype.hex2bin = function ()
{
    // Define the variables
    var i = 0, l = this.length - 1, bytes = []
    // Iterate over the bytes (pairs of hex digits) and collect their values
    for (i; i < l; i += 2)
    {
        bytes.push(parseInt(this.substr(i, 2), 16))
    }
    // Return the binary string
    return String.fromCharCode.apply(String, bytes)
}
```
This works as expected, returning `CN@[4EÀ¶ÃYûy` from `434e405b823445c09cb6c359fb1b7918`.
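Called like so (the result contains control characters, so exactly how it renders depends on where it's printed):

```js
// Returns "CN@[4EÀ¶ÃYûy"
'434e405b823445c09cb6c359fb1b7918'.hex2bin()
```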
The problem I have, however, is that when dealing directly with the data returned by PHP's `hex2bin` function I am given the string `CN@[�4E����Y�y` rather than `CN@[4EÀ¶ÃYûy`. This is making it impossible for me to work between the two (for context, JavaScript is being used to power an offline iPad app that works with data retrieved from a PHP web app), as I need to be able to use JavaScript to generate a 32-character hexadecimal string, convert it to a binary string, and have it work with PHP's `hex2bin` function (and SQLite's `HEX` function).
The issue, I believe, is that JavaScript uses UTF-16 whereas the binary string is stored as utf8_unicode_ci. My initial thought, then, was that I need to convert the string to UTF-8. A Google search led me here and searching StackOverflow led me to bobince's answer here, both of which recommend using `unescape(encodeURIComponent(str))`. However, this doesn't return what I need (`CN@[�4E����Y�y`):

```js
// Returns CN@[Â4EöÃYûy
unescape(encodeURIComponent('434e405b823445c09cb6c359fb1b7918'.hex2bin()))
```
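Comparing the two outputs byte by byte, it looks as though every character above `0x7F` is coming back as its multi-byte UTF-8 sequence rather than as a single byte, for example:

```js
// 0xC0 ("À") encodes to the two UTF-8 bytes 0xC3 0x80...
encodeURIComponent('\u00C0') // "%C3%80"
// ...which unescape() then decodes as two separate characters
unescape('%C3%80')           // "Ã\u0080"
```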
My question, then, is: how can I use JavaScript to convert a hexadecimal string into a UTF-8 binary string?