I know various incarnations of this question have been asked and answered on Stack Overflow many times, but the closest matches I could find focus only on Unicode combining sequences (e.g., a base character plus a diacritic), which sidesteps the real issue for me: how the length of an NSString is actually computed.
My example strings do not contain any combining characters:
http://static.inky.ws/image/5413/Screen%20Shot%202015-11-07%20at%2012.05.05%20AM.png
Why does -[NSString length] tell me that each of these strings (each of which, again, is just a single printable character with no combining characters attached) has a length of 2?
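In case the screenshot above is unavailable, here is a minimal sketch of the kind of result I'm describing. The emoji literal is just a stand-in I'm using for illustration, not one of the exact characters in my screenshot, but it produces the same kind of unexpected count:

```objc
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // A single visible character with no combining marks
        // (an emoji, used here as a stand-in for my test strings).
        NSString *s = @"😄";

        // I would expect 1, but this logs: length = 2
        NSLog(@"length = %lu", (unsigned long)s.length);
    }
    return 0;
}
```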
Apple's documentation for -[NSString length] says:
The number of Unicode characters in the receiver. This number includes the individual characters of composed character sequences, so you cannot use this property to determine if a string will be visible when printed or how long it will appear.
By that definition, it seems reasonable to conclude that each of these test strings has a length of 1.
Obviously I'm wrong. I just need someone to explain why.
(This question is not a duplicate of "Number of characters in a string (not number of bytes)"; the OP in that thread was not asking why.)