
I've got an emoji picker in a React project that I use to insert emojis into a text field. Upon inserting, the emoji is visible and renders correctly. However, if another is inserted, all previous emojis are replaced with the unknown-character symbol, despite each emoji being a complete and valid character. I also tried storing the message in React state and converting it to and from code points programmatically, yet the issue persists. See the code example: https://gist.github.com/J-Cake/8ab27a809aaf0cf14a7e2b78cbcbacf2 I'm wondering if there is a simple mistake in the code or if there's something larger I'm missing.

Edit: I might add that I'm on Ubuntu, where emoji support may be limited, but emoji test pages, and the fact that the emoji does render, suggest that has nothing to do with the issue.

Edit 2: I've also found that the issue persists in the JS console and in Firefox.

J-Cake

2 Answers


Here is how I use emojis in React projects:

const emojis = [
  '128512',
  '128514',
  '128519'
];

emojis.map(emoji => <span key={emoji}>{String.fromCodePoint(emoji)}</span>)

As you can see, I use decimal code points. Hex would work too, I guess.
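For instance, a minimal sketch showing that decimal and hex literals name the same code point:

```javascript
// 128512 (decimal) and 0x1F600 (hex) both denote the grinning-face emoji.
const fromDecimal = String.fromCodePoint(128512);
const fromHex = String.fromCodePoint(0x1F600);

console.log(fromDecimal === fromHex); // true
console.log(fromDecimal);             // 😀
```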

I use the following chart: https://www.w3schools.com/charsets/ref_emoji_smileys.asp

Edit:

Even though you've fixed it already, here is what I propose:

insert(emoji: BaseEmoji) {
  this.setState((prev: State) => ({
    messageContent: [
      ...prev.messageContent.slice(0, prev.cursorStart),
      emoji.native.codePointAt(0),
      ...prev.messageContent.slice(prev.cursorEnd + 1),
    ],
    cursorStart: prev.cursorStart + 1,
    cursorEnd: prev.cursorStart + 1,
  }));
}
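Assuming messageContent is stored as an array of numeric code points, as the snippet above implies, it can be rendered back to a string with String.fromCodePoint. A sketch (renderMessage is a hypothetical helper, not part of the original code):

```javascript
// Hypothetical helper: joins an array of numeric code points back into a string.
function renderMessage(messageContent) {
  return String.fromCodePoint(...messageContent);
}

// Round-trip: split by code points (not UTF-16 code units), then rebuild.
// Array.from iterates a string by code points, so the emoji stays intact.
const points = Array.from('Hi 😀', ch => ch.codePointAt(0));
console.log(renderMessage(points)); // 'Hi 😀'
```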
Tobias Boertz
  • I see you're using strings as the code point. However, converting the code point to a string prior to the `String.fromCodePoint` call causes TypeScript to complain. – J-Cake Mar 29 '20 at 09:36

I found the answer.

I was reading the MDN docs and noticed this disclaimer:

Warning: When the empty string ("") is used as a separator, the string is not split by user-perceived characters (grapheme clusters) or Unicode characters (code points), but by UTF-16 code units. This destroys surrogate pairs. See "How do you get a string to a character array in JavaScript?" on StackOverflow.

I would have completely dismissed this had I not noticed something odd in the console: two characters were being printed per emoji. I figured this was a weird Unicode quirk and dismissed it. It turns out the fix was in the numerify function; replace the str.split call with Array.from(str) and behold, a working example.
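To illustrate what the warning describes, a minimal sketch:

```javascript
const s = '😀';

// The emoji is one code point but occupies two UTF-16 code units.
console.log(s.length);      // 2

// split('') cuts between the code units, producing two broken surrogate halves.
console.log(s.split(''));   // ['\ud83d', '\ude00']

// Array.from iterates by code points, keeping the emoji intact.
console.log(Array.from(s)); // ['😀']
console.log(Array.from(s).map(ch => ch.codePointAt(0))); // [128512]
```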

J-Cake