I have custom print functions that I use to print numbers. I made an ASCII version and a UTF-16LE version; the UTF-16LE version uses the fullwidth characters for the digits 0-9 and the letters A-F when printing hexadecimal. While debugging these functions I noticed that the fullwidth characters render a little differently in Visual Studio than the ASCII characters. That didn't bother me, but it got me thinking about it.
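For context, this is roughly the kind of mapping the UTF-16LE version does (a simplified sketch, not my actual function):

```c
#include <stdint.h>

/* The fullwidth digits start at U+FF10 (FULLWIDTH DIGIT ZERO) and the
   fullwidth capital letters at U+FF21 (FULLWIDTH LATIN CAPITAL LETTER A),
   so a hex digit value 0-15 maps to a UTF-16 code unit like this: */
static uint16_t hex_digit_fullwidth(unsigned v)
{
    return (v < 10) ? (uint16_t)(0xFF10 + v)          /* '0'..'9' */
                    : (uint16_t)(0xFF21 + (v - 10));  /* 'A'..'F' */
}

/* The ASCII version of the same lookup, for comparison. */
static char hex_digit_ascii(unsigned v)
{
    return (v < 10) ? (char)('0' + v)
                    : (char)('A' + (v - 10));
}
```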
So I did a quick Google search for "Unicode halfwidth vs fullwidth"... and I found several pages saying that "Fullwidth" refers to the visual width of the characters, whereas I had thought "Fullwidth" referred to the width of the encoding (2 bytes or more).
Here are a few pages and quotes from them:
- https://en.wikipedia.org/wiki/Halfwidth_and_fullwidth_forms
- ICU Unicode Normal vs Fullwidth
> To make things line up neatly, IBM defined a set of 'full-width' (better would have been 'double-width') letters and numbers.
- https://en.wikipedia.org/wiki/Half-width_kana
> Half-width kana are katakana characters displayed at half their normal width (a 1:2 aspect ratio), instead of the usual square (1:1) aspect ratio. For example, the usual (full-width) form of the katakana ka is カ while the half-width form is ｶ.
It doesn't make sense to me that "Fullwidth" would refer to the visual width, when we already have different fonts to handle size and alignment.
Why does "Fullwidth" refer to the visual width? Where in the Unicode UTF-16 spec does it say this?
Would having the choice to output either halfwidth or fullwidth via flags be desirable?
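By "flags" I mean something like the sketch below (FMT_FULLWIDTH is a hypothetical name; the mapping relies on the fullwidth forms U+FF01-U+FF5E sitting at a fixed offset of 0xFEE0 from ASCII 0x21-0x7E):

```c
#include <stdint.h>

#define FMT_FULLWIDTH 0x1u  /* hypothetical output flag */

/* Pick the UTF-16 code unit to emit for an ASCII character. The fullwidth
   block U+FF01..U+FF5E mirrors ASCII 0x21..0x7E at a fixed offset of
   0xFEE0, so a single branch covers digits, letters, and punctuation. */
static uint16_t out_unit(char c, unsigned flags)
{
    if ((flags & FMT_FULLWIDTH) && c >= 0x21 && c <= 0x7E)
        return (uint16_t)c + 0xFEE0;   /* fullwidth form */
    return (uint16_t)c;                /* halfwidth: plain ASCII */
}
```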