If I want to display a special character like ⬀, ⤴ or ➶, how can I tell whether the user's browser can display it? I don't want the character to come out as ▯; if it would, I'd rather display a plain ↗.
So how can I ensure that whatever character I emit is displayed correctly?
I found an existing answer to this question, but unfortunately it doesn't work in my situation.
The trick there is to insert a character that you know has no glyph into the DOM, so it's guaranteed not to display, and then compare its rendered width to the width of your desired character. If the widths are equal, chances are your desired character is also undefined and won't display either.
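For reference, here is roughly how that trick looks in code. This is my own minimal sketch, not the linked answer's exact code; the helper names (`charRenderedWidth`, `probablyDisplayable`) and the styling choices are just for illustration:

```typescript
// Measure the rendered width of a single character by putting it in a
// hidden, absolutely positioned <span> and reading its bounding box.
function charRenderedWidth(char: string): number {
  const span = document.createElement("span");
  span.style.visibility = "hidden"; // don't flash the glyph on screen
  span.style.position = "absolute"; // don't disturb the page layout
  span.style.fontSize = "128px";    // a large size exaggerates width differences
  span.textContent = char;
  document.body.appendChild(span);
  const width = span.getBoundingClientRect().width;
  document.body.removeChild(span);
  return width;
}

// Compare against a reference codepoint assumed to render as the
// "undefined glyph" box. U+FFFD is what the linked answer uses; as
// described below, that assumption does not hold everywhere. Note also
// that a real glyph can coincidentally have the same width as the box,
// which would give a false negative.
function probablyDisplayable(char: string, reference = "\uFFFD"): boolean {
  return charRenderedWidth(char) !== charRenderedWidth(reference);
}

// Usage: fall back to a plain arrow when the fancy one looks undefined.
const arrow = probablyDisplayable("\u2934") ? "\u2934" : "\u2197"; // ⤴ vs ↗
```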
However, this trick doesn't work in all situations. The linked answer uses U+FFFD as the non-displaying character, but on some devices (in my case, a Windows Mobile phone) U+FFFD is a defined character whose glyph has a different width than the one used for undefined characters (� vs. ▯). Codepoints that are guaranteed non-characters, such as U+FFFE, are no better: they are displayed as question marks (?) rather than ▯s, so their widths don't match either.
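To illustrate the problem: on such a device, the obvious reference codepoints don't even agree with each other on width, so no single one of them can safely stand in for "the undefined-glyph box". This sketch reuses `charRenderedWidth` from above; U+0378 is, as far as I know, currently unassigned:

```typescript
// Probe several candidate "undefined" codepoints. If their rendered
// widths disagree, the width comparison gives no reliable signal.
const candidates = ["\uFFFD", "\uFFFE", "\u0378"]; // replacement char, non-character, unassigned
const widths = candidates.map((c) => charRenderedWidth(c));
if (!widths.every((w) => w === widths[0])) {
  console.warn("Reference glyphs disagree on width:", widths);
}
```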
So my question is: what can I compare my character against so that I can reliably detect that it is non-displayable?