Yes, it's a built-in compiler mechanism for @"" string literals; no operator overloading is involved. The compiler emits a ready-made __NSCFConstantString object directly into the binary, so it is already initialised by the time the Mach-O __DATA segment is mapped in during program loading. You can verify this with a simple program:
#import <Foundation/Foundation.h>

NSString *globalString = @"globalString";

int main(int argc, const char * argv[]) {
    NSString *localString = @"localString";
    static NSString *staticString = @"staticString";
    // stringWithFormat: builds its result at run time, so this one is not a compile-time constant
    NSString *heapString = [NSString stringWithFormat:@"heapStringExample"];
    return 0; // put breakpoint here
}
Using lldb:
(lldb) p localString
(__NSCFConstantString *) $1 = 0x0000000100001050 @"localString"
(lldb) p staticString
(__NSCFConstantString *) $2 = 0x0000000100001070 @"staticString"
(lldb) p globalString
(__NSCFConstantString *) $3 = 0x0000000100001030 @"globalString"
(lldb) p heapString
(__NSCFString *) $4 = 0x0000000103307f50 @"heapStringExample"
(lldb) memory region 0x0000000100001050
[0x0000000100001000-0x0000000100002000) rw- __DATA
(lldb) memory region 0x0000000103307f50
[0x0000000103300000-0x0000000103400000) rw-
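A related consequence worth knowing (a quick sketch; literal uniquing is a compiler/linker detail rather than a language guarantee, and the string contents below are just illustrative): identical @"" literals typically resolve to the same baked-in constant object, so plain pointer comparison succeeds for them, while a string built at run time is always a distinct object:

#import <Foundation/Foundation.h>

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSString *a = @"sameLiteral";
        NSString *b = @"sameLiteral";                       // typically the very same constant object as a
        NSString *c = [NSString stringWithFormat:@"%@", a]; // built at run time, never the constant

        NSLog(@"a == b: %d", a == b);                                // usually 1: both point at the baked-in constant
        NSLog(@"a == c: %d", a == c);                                // 0: a different object (heap or tagged pointer)
        NSLog(@"[a isEqualToString:c]: %d", [a isEqualToString:c]);  // 1: equal contents
    }
    return 0;
}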
You may notice that localString, staticString and globalString all live in the same memory range inside the __DATA segment, regardless of where they are declared, whereas heapString was allocated on the heap at run time.
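What actually ends up in the binary is, roughly, one small constant structure per literal, placed in the __cfstring section of the __DATA segment (newer toolchains may use __DATA_CONST). A C-level sketch of that layout follows; the struct and field names here are illustrative, not taken from any SDK header:

// Approximate shape of what the compiler emits for each @"" literal.
struct ConstantStringSketch {
    void       *isa;    // points at the constant-string class, so the object is usable without runtime setup
    int         flags;  // encoding/info bits consumed by CoreFoundation
    const char *bytes;  // pointer to the characters, which are also baked into the binary
    long        length; // character count, known at compile time
};

Because the whole structure is laid out at compile time and simply mapped in with the image, no allocation or initialisation happens for these objects at run time, which is why the three literals above land inside the binary's own __DATA range.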
This is a good article exploring the internal implementation of the NSString class cluster.
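If you want to poke at the cluster yourself, a small sketch like the one below prints each string's concrete (private) class and its superclass chain; the exact class names you see are private implementation details and can differ between OS releases:

#import <Foundation/Foundation.h>
#import <objc/runtime.h>
#include <stdio.h>

static void dumpClassChain(NSString *s) {
    // Walk from the concrete private subclass up to the root of the hierarchy.
    for (Class c = [s class]; c != Nil; c = class_getSuperclass(c)) {
        printf("%s -> ", class_getName(c));
    }
    printf("nil\n");
}

int main(int argc, const char * argv[]) {
    @autoreleasepool {
        dumpClassChain(@"a compile-time constant");                        // starts at __NSCFConstantString
        dumpClassChain([NSString stringWithFormat:@"built at run time"]);  // a different private subclass
    }
    return 0;
}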