Maybe this will clear things up.
C89 introduced a new integer type, wchar_t. This is similar to a char, but typically "wider". On many systems, including Windows, a wchar_t is 16 bits. This is typical of systems that implemented their Unicode support against earlier versions of the Unicode standard, which originally defined fewer than 65,536 characters. Unicode was later expanded to support historical and special-purpose character sets, so on some systems, including Mac OS X and iOS, the wchar_t type is 32 bits in size. This is often poorly documented, but you can find out with a simple test like this:
// how big is wchar_t?
NSLog(@"wchar_t is %zu bits wide", 8 * sizeof(wchar_t));
On a Mac or iPhone, this will print "wchar_t is 32 bits wide". (Note that sizeof yields a size_t, so the %zu format specifier is used rather than %u.) Additionally, in C, wchar_t is a typedef for another integer type, while in C++ it is a distinct built-in type. In practice, this means you need to #include &lt;wchar.h&gt; (or &lt;stddef.h&gt;) in C when using wide characters, but not in C++.
Ref: http://blog.ablepear.com/2010/07/objective-c-tuesdays-wide-character.html