
Looking for a performant algorithm for parsing any-length hexadecimal strings into Uint8Arrays. I specifically care about use with Chrome and Firefox's engines.

A common approach used in code snippets:

function hex2bytes(string) {
  const normal = string.length % 2 ? "0" + string : string; // Make even length
  const bytes = new Uint8Array(normal.length / 2);
  for (let index = 0; index < bytes.length; ++index) {
    bytes[index] = parseInt(normal.substr(index * 2, 2), 16); // Parse each pair
  }
  return bytes;
}

Another common (bad) approach is to split the string using the regular expression /[\dA-F]{2}/gi and iterate over the matches with parseInt.
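For concreteness, the regex approach tends to look something like this (a sketch; the exact snippet varies between sources). It allocates an intermediate array of match strings and calls parseInt once per byte, which is part of why it benchmarks poorly:

```javascript
// Regex-based variant: split the string into two-character matches,
// then parse each pair. Unpaired trailing characters are silently dropped.
function hex2bytesRegex(string) {
  const matches = string.match(/[\dA-F]{2}/gi) || [];
  return Uint8Array.from(matches, (pair) => parseInt(pair, 16));
}
```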

I've found a much faster algorithm using charCodeAt:

function hex2bytes(string) {
  const normal = string.length % 2 ? "0" + string : string; // Make even length
  const bytes = new Uint8Array(normal.length / 2);
  for (let index = 0; index < bytes.length; ++index) {
    const c1 = normal.charCodeAt(index * 2);
    const c2 = normal.charCodeAt(index * 2 + 1);
    // '0'..'9' are codes 48..57; 'a'..'f' are 97..102, hence the 87 offset
    const n1 = c1 - (c1 < 58 ? 48 : 87);
    const n2 = c2 - (c2 < 58 ? 48 : 87);
    bytes[index] = n1 * 16 + n2;
  }
  return bytes;
}
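Note that this version assumes lowercase hex digits: the 87 offset maps 'a'..'f', so uppercase input produces wrong bytes. A case-tolerant variant could lowercase first, at the cost of one extra string copy (a sketch, not a benchmarked claim):

```javascript
// Same charCodeAt technique, but tolerant of uppercase hex by
// lowercasing the input before decoding.
function hex2bytesAnyCase(string) {
  const normal = string.length % 2 ? "0" + string : string; // Make even length
  const lower = normal.toLowerCase();
  const bytes = new Uint8Array(lower.length / 2);
  for (let index = 0; index < bytes.length; ++index) {
    const c1 = lower.charCodeAt(index * 2);
    const c2 = lower.charCodeAt(index * 2 + 1);
    // '0'..'9' are codes 48..57; 'a'..'f' are 97..102
    bytes[index] = (c1 - (c1 < 58 ? 48 : 87)) * 16 + (c2 - (c2 < 58 ? 48 : 87));
  }
  return bytes;
}
```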

Can I do better?

  • That sounds more like a question for code review than for here – mplungjan Jul 12 '21 at 06:02
  • `for (let index = 0, len = bytes.length; index < len; ++index) {` might shave off half a nanosecond – mplungjan Jul 12 '21 at 06:04
  • @mplungjan I'm not as familiar with code review, but I would think that since this question is asking for a whole new algorithm rather than a review of any of the given ones, it doesn't seem like a good fit. – Shelvacu Jul 12 '21 at 06:11
  • Possibly. Anyway, my suggestion did not seem to make any difference: https://imgur.com/a/afkMAfl – mplungjan Jul 12 '21 at 06:14

0 Answers