
As far as I can tell, TextDecoder.decode does not respect the "ascii" label passed to the constructor. The example below produces the same values on Chromium and on Node 18.17.1. Can someone explain why TextDecoder returns different code points than String.fromCharCode for some of the bytes?

Code

const array = new Uint8Array([173, 148, 130, 203, 88]);
console.log(array);

const decoder = new TextDecoder("ascii");
const value = decoder.decode(array);
console.log(`Length: ${value.length}`);

console.log("Uint8|fromCharCode|TextDecoder");
// Compare each input byte against both conversion paths
for (let i = 0; i < value.length; i++) {
    console.log(`${array[i]}|${String.fromCharCode(array[i]).charCodeAt(0)}|${value.charCodeAt(i)}`);
    console.log(String.fromCharCode(array[i]));
}

Output

Uint8|fromCharCode|TextDecoder
173|173|173
148|148|8221
130|130|8218
203|203|203
88|88|88
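
A minimal sketch of the two mapping behaviors, assuming the WHATWG Encoding Standard rule that the "ascii" label is an alias for windows-1252 (TextDecoder is global in Node 11+ and in browsers):

```javascript
const bytes = new Uint8Array([148, 130]);

// String.fromCharCode treats each number as a raw UTF-16 code unit,
// which for values below 256 coincides with ISO-8859-1. Bytes in
// 0x80-0x9F therefore stay as C1 control characters.
const latin1 = String.fromCharCode(...bytes);
console.log([...latin1].map(c => c.codePointAt(0))); // [ 148, 130 ]

// TextDecoder resolves the "ascii" label to windows-1252, whose table
// remaps most of 0x80-0x9F to printable punctuation.
const decoder = new TextDecoder("ascii");
console.log(decoder.encoding); // "windows-1252"
const decoded = decoder.decode(bytes);
console.log([...decoded].map(c => c.codePointAt(0))); // [ 8221, 8218 ]
```

So the table above is consistent: 173, 203, and 88 happen to map to the same code point in both schemes, while 148 and 130 fall in the range where windows-1252 diverges from ISO-8859-1.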
jemartin80
  • Look at the code page layout at https://en.wikipedia.org/wiki/Windows-1252 (the "ascii" label resolves to Windows-1252 according to the docs) ... you will see that bytes 148 and 130 map to 0x201D and 0x201A (8221 and 8218 decimal). – Jaromanda X Aug 10 '23 at 00:32
  • Thanks. Looking at that, it seems the decoder is treating the input as Windows-1252 rather than ISO-8859-1. But based on https://stackoverflow.com/questions/19109899/what-is-the-exact-difference-between-windows-1252-and-iso-8859-1#:~:text=Windows%2D1252%20ISO%20Latin%201,punctuation%20characters)%2C%20others%20are%20left these are not the same encoding. – jemartin80 Aug 10 '23 at 01:36

0 Answers