MSDN (https://msdn.microsoft.com/en-us/library/69ze775t.aspx) describes Visual Studio character literals as follows:
"Multiple characters in the literal fill corresponding bytes as needed from high-order to low-order. To create a char value, the compiler takes the low-order byte."
According to that document, the following is exactly the result I expected:
unsigned int i = '1234'; // i = 0x34333231, memory[low->high] [0x31, 0x32, 0x33, 0x34]
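Given that result for i, my mental model of the documented rule was something like the sketch below. This is just my interpretation, not actual compiler code, and make_literal is a name I made up for illustration:

unsigned int make_literal(const unsigned char *chars, int n)
{
    /* my assumption: character i lands in byte i, from low to high */
    unsigned int value = 0;
    int i;
    for (i = 0; i < n; ++i)
        value |= (unsigned int)chars[i] << (8 * i);
    return value;
}

/* make_literal((const unsigned char *)"1234", 4) == 0x34333231, matching i above */

Under this model, '\1\2\3\4' should come out as 0x04030201.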
However, when escape sequences appear, things change. Here are the results I get on my PC:
unsigned int i = '1234'; // i = 0x34333231
unsigned int j = '\1\2\3\4'; // j = 0x01020304 <- ???????
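Here is a minimal, self-contained program that reproduces what I see. A multi-character literal's value is implementation-defined, so other compilers may print something entirely different:

#include <stdio.h>

int main(void)
{
    unsigned int i = '1234';     /* plain characters only */
    unsigned int j = '\1\2\3\4'; /* octal escape sequences only */
    const unsigned char *p = (const unsigned char *)&j;

    printf("i = 0x%08X\n", i);
    printf("j = 0x%08X\n", j);
    /* dump j's bytes from low address to high address */
    printf("j memory[low->high] [0x%02X, 0x%02X, 0x%02X, 0x%02X]\n",
           (unsigned)p[0], (unsigned)p[1], (unsigned)p[2], (unsigned)p[3]);
    return 0;
}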
Wait a minute, what just happened here? I expected j to be 0x04030201. Where did the high-order to low-order filling go? I can't figure it out by myself.
That is my first question: why doesn't the compiler fill the bytes from high-order to low-order when octal escape sequences occur?
Yet that's not the whole story. Here is something even more interesting:
unsigned int k = '0\101\1001'; // k = 0x31403041, memory[low->high] [0x41, 0x30, 0x40, 0x31] <- what is going on?
unsigned int l = '0\1011\100'; // l = 0x40304131, memory[low->high] [0x31, 0x41, 0x30, 0x40] <- and again?
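Note how the literals split: an octal escape sequence consumes up to three octal digits, so '0\101\1001' is the four characters '0' (0x30), '\101' (0x41), '\100' (0x40), '1' (0x31), while '0\1011\100' is '0', '\101', '1', '\100'. The same kind of dump program for these two (again, the printed values are just what my compiler produces):

#include <stdio.h>

static void dump(const char *name, unsigned int v)
{
    const unsigned char *b = (const unsigned char *)&v;
    /* print the value and its bytes from low address to high address */
    printf("%s = 0x%08X, memory[low->high] [0x%02X, 0x%02X, 0x%02X, 0x%02X]\n",
           name, v, (unsigned)b[0], (unsigned)b[1], (unsigned)b[2], (unsigned)b[3]);
}

int main(void)
{
    unsigned int k = '0\101\1001'; /* '0', '\101', '\100', '1' */
    unsigned int l = '0\1011\100'; /* '0', '\101', '1', '\100' */
    dump("k", k);
    dump("l", l);
    return 0;
}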
At this point I'm totally lost. I can't even derive a simple rule from these test cases.
Does anybody know what is going on here? Thanks.