
In a web application, I have to display a special Unicode character, known as BLACK DIAMOND (U+25C6) (see here for more details). Here is a sample: ◆

The font defined for the page is Arial, with size 13px.

Surprisingly, the character is rendered at a bigger size in IE6 than in other browsers (Firefox, Chrome, ...).

Is there any reason for this weird behavior, and what is the solution to avoid it?


2 Answers


This is because the character is missing from the font you specified. The browser therefore looks for another font that contains the character so it can still display it. Different browsers pick different fallback fonts, so you'll see it rendered a little differently in each.

There isn't much you can do in general to avoid this, because missing glyphs are very common on the web and you cannot really rely on any particular font being present on the user's machine. You can try mitigating it, though:

  1. You can put the U+25C6 in a span that's styled to use a different but specific font that has the character (and which works well with your main font); see the sketch after this list.
  2. Same as above, but distribute a web font (WOFF seems to be a reasonable choice nowadays) that contains the glyph. That way you have more control over what is displayed.
  3. Stay far, far away from specifying fallback fonts like Arial Unicode MS. Just don't use them at all.
  4. If you're just after the looks of U+25C6 and don't care about having it actually in text form you can use an image or a CSS hack.
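
A minimal sketch of options 1 and 2 (the font name SymbolFallback and the file symbols.woff are hypothetical placeholders; DejaVu Sans is merely one example of a widely available font that contains U+25C6):

<style>
  /* Option 2 (hypothetical): serve a web font that contains the glyph */
  @font-face {
    font-family: "SymbolFallback";                  /* placeholder name */
    src: url("fonts/symbols.woff") format("woff");  /* placeholder file */
  }
  body { font-family: Arial, sans-serif; font-size: 13px; }
  /* Option 1: style just the character with a font known to have it */
  .diamond { font-family: "SymbolFallback", "DejaVu Sans", sans-serif; }
</style>

<p>Milestone <span class="diamond">&#x25C6;</span> reached</p>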
  • You are right. I thought BLACK DIAMOND was defined in Arial, but it is not; only BLACK DIAMOND SUIT is defined (which looks almost the same; I didn't take the time to read the character description in Character Map the first time, I only saw a diamond and thought it was there...). – tigrou Jul 02 '12 at 11:21
  • Oh, indeed, using another, similar character works too, of course ;) – Joey Jul 02 '12 at 11:24
  • I ended up using solution 1. I use BLACK DIAMOND to show milestones on the page, and using BLACK DIAMOND SUIT for this just looks butt ugly :D – tigrou Jul 02 '12 at 11:39

After being puzzled for a while, I realized that browsers render Unicode characters with different fonts depending on the order of the characters. Here's an example using the N-Ary Union ⋃ (U+22C3), 𝕊 (U+1D54A), and ℝ (U+211D):

<p>⋃</p>
<p>𝕊⋃</p>
<p>ℝ⋃</p>
<p>⋃ℝ</p>

On my Mac, Chrome renders the first paragraph with STIXGeneral, the second paragraph with Apple Symbols and STIXGeneral, the third paragraph with Menlo and Apple Symbols, and the fourth paragraph only with Apple Symbols. Firefox renders everything with STIXGeneral except the ℝ in the third paragraph, which it renders with Geneva.

(Chrome shows the rendered fonts at the bottom of the Computed tab when inspecting an element with the developer tools. Firefox has a Fonts tab when inspecting an element. I couldn't find anything similar for Safari, which is confirmed by this answer.)

As far as I can tell, this is a simple optimization: If a glyph exists in a font already loaded for a particular "text node", use this font. Otherwise, search for another font which can render this glyph. Interestingly, I observed the same behavior (large ⋃ after 𝕊 and small ⋃ before 𝕊) also in Visual Studio Code and Apple Pages.

This optimization has a subtle security implication: If you print out a document and black out some text, the rendering of the later characters can reveal information about the blacked-out text.

(I put "text node" in quotation marks because <span></span>⋃ leads to the same behavior while <span style="font-weight: bold;"></span>⋃ does not.)
