After being puzzled for a while, I realized that browsers render Unicode characters with different fonts depending on the order of the characters. Here's an example using the N-Ary Union ⋃ (U+22C3), 𝕊 (U+1D54A), and ℝ (U+211D):
```html
<p>⋃𝕊</p>
<p>𝕊⋃</p>
<p>ℝ⋃𝕊</p>
<p>𝕊⋃ℝ</p>
```
On my Mac, Chrome renders the first paragraph with STIXGeneral, the second paragraph with Apple Symbols and STIXGeneral, the third paragraph with Menlo and Apple Symbols, and the fourth paragraph only with Apple Symbols. Firefox renders everything with STIXGeneral except the ℝ in the third paragraph, which it renders with Geneva.
(Chrome shows the rendered fonts at the bottom of the Computed tab when inspecting an element with the developer tools. Firefox has a Fonts tab when inspecting an element. I couldn't find anything similar for Safari, which is confirmed by this answer.)
As far as I can tell, this is a simple optimization: If a glyph exists in a font already loaded for a particular "text node", use this font. Otherwise, search for another font which can render this glyph. Interestingly, I observed the same behavior (a large ⋃ after 𝕊 and a small ⋃ before 𝕊) in Visual Studio Code and Apple Pages as well.
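To make the suspected logic concrete, here is a sketch in JavaScript. This is only my assumption of what browsers roughly do, not their actual implementation; it reuses the `hasGlyph()` helper from above and takes a hypothetical fallback list as its second argument:

```js
// Assumed per-character font selection: stick with the font of the
// previous character if it covers the current one; otherwise, walk the
// fallback list until a font with the required glyph is found.
function chooseFonts(text, fallbackFonts) {
  const assignments = [];
  let currentFont = null;
  for (const character of text) { // iterates by code points, so 𝕊 stays intact
    if (currentFont === null || !hasGlyph(currentFont, character)) {
      // Search for the first font which covers the character;
      // fall back to the first font (tofu) if none of them does.
      currentFont = fallbackFonts.find(font => hasGlyph(font, character)) ?? fallbackFonts[0];
    }
    assignments.push([character, currentFont]);
  }
  return assignments;
}

// Whether ⋃ ends up in the same font in both strings depends on which
// of the installed fonts cover which characters:
console.log(chooseFonts('⋃𝕊', ['STIXGeneral', 'Apple Symbols']));
console.log(chooseFonts('𝕊⋃', ['STIXGeneral', 'Apple Symbols']));
```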
This optimization has a subtle security implication: If you print out a document and black out some text, the rendering of the later characters can reveal information about the blacked-out text.
(I put "text node" in quotation marks because <span></span>⋃
leads to the same behavior while <span style="font-weight: bold;"></span>⋃
does not.)