Case 1:
You are getting 'a'*256² + 's'*256 + 'b' = 6386530 because 'a' == 97, 's' == 115 and 'b' == 98 (cf. the ASCII table). 'asb' is a multi-character constant, and it is interpreted as an integer:
typeid('asb').name()[0] == 'i' && sizeof('asb') == 4;
An int is typically 32 bits, so the 24 bits of 'asb' fit into it. That's why std::cout treats it as an int and displays 6386530.
Note also that:
typeid('xxxxabcd').name()[0] == 'i' && sizeof('xxxxabcd') == 4;
but 'xxxxabcd' needs 64 bits of data, so the upper 32 bits are lost.
std::cout << 'xxxxabcd';
std::cout << 'abcd';
would print the same thing.
Case 2:
'asb' is again interpreted as an integer, which is then converted to a char (8 bits). As @BenjaminJones pointed out, only the last 8 bits (98 == 'b') survive, and std::cout interprets the result as a char, so it displays 'b'.
In any case, both cases provoke compilation warnings such as:
warning: multi-character character constant [-Wmultichar]
warning: overflow in implicit constant conversion [-Woverflow]
The exact value depends on the compiler: the standard only guarantees that a multi-character constant has type int, and leaves its value implementation-defined.