
I have, for example, this emoji:

😸

... and I want to get its char code. I tried this:

var emoji = "😸"
var char_code = emoji.charCodeAt()
console.log(emoji, char_code) // logs the emoji and 55357, only the first UTF-16 code unit

But when I use String's fromCharCode method, I don't get the original emoji back:

var emoji = "😸"
var char_code = emoji.charCodeAt()
var original = String.fromCharCode(char_code)
console.log(original) // logs a lone surrogate, not the emoji

How can I get the original emoji back from its char code? Or, failing that, how can I get its actual char code to use with String.fromCharCode?

ArtEze

1 Answer


String.codePointAt() and String.fromCodePoint() are made for this, though in legacy browsers it is also possible to rebuild the character from its UTF-16 char codes using surrogate pairs (see the discussion here and the sketch below).

let emoji = "😸",
    charCode = emoji.charCodeAt(),
    codePoint = emoji.codePointAt();

console.log(
    'charCode:', charCode,
    String.fromCharCode(charCode)   // doesn't work =(
);

console.log(
    'codePoint:', codePoint,
    String.fromCodePoint(codePoint) // there we go!
);
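
For completeness, here is a minimal sketch of the legacy surrogate-pair route mentioned above, using the same example emoji. It relies only on charCodeAt and fromCharCode, which exist in older browsers:

// An emoji outside the Basic Multilingual Plane occupies two UTF-16 code
// units (a surrogate pair), so read both of them.
var emoji = "😸"; // example emoji, U+1F638
var high = emoji.charCodeAt(0); // high (lead) surrogate: 0xD83D
var low  = emoji.charCodeAt(1); // low (trail) surrogate: 0xDE38

// Passing both code units to String.fromCharCode rebuilds the emoji.
console.log(String.fromCharCode(high, low)); // 😸

// The full code point can also be computed from the pair by hand.
var codePoint = (high - 0xD800) * 0x400 + (low - 0xDC00) + 0x10000;
console.log(codePoint.toString(16)); // "1f638"

This is essentially what String.fromCodePoint does for you, so prefer the code point methods whenever they are available.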
Lewis