^ stands for XOR.
XORing two identical bits gives 0; two different bits give 1.
E.g. 1^0 == 1, 1^1 == 0
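A quick way to check this is a minimal C snippet (purely illustrative, not part of the original question):

    #include <stdio.h>

    int main(void) {
        printf("1 ^ 0 = %d\n", 1 ^ 0);  /* different bits -> 1 */
        printf("1 ^ 1 = %d\n", 1 ^ 1);  /* same bits      -> 0 */
        printf("0 ^ 0 = %d\n", 0 ^ 0);  /* same bits      -> 0 */
        return 0;
    }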
An int in C is 16 bits on a 16-bit compiler or 32 bits on a 32-bit compiler. So whether a is initialized or not, it occupies a 16/32-bit pattern.
Considering a 16-bit compiler:
Bit pattern of 3 is  0000 0000 0000 0011
XOR
Bit pattern of 6 is  0000 0000 0000 0110
Result is -->        0000 0000 0000 0101 ---> 5
It doesn't matter what value a actually holds: a^a is always 0, because both operands have the same bit pattern.
Therefore (3^6) + (a^a) = 5 + 0 = 5.
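A minimal sketch in C that shows this (a is given a concrete value here, since reading a truly uninitialized variable is undefined behaviour; the point is that a ^ a is 0 for any value):

    #include <stdio.h>

    int main(void) {
        int a = 42;  /* any value works; a ^ a is always 0 */
        printf("3 ^ 6          = %d\n", 3 ^ 6);              /* 5 */
        printf("a ^ a          = %d\n", a ^ a);              /* 0 */
        printf("(3^6) + (a^a)  = %d\n", (3 ^ 6) + (a ^ a));  /* 5 */
        return 0;
    }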
Now, if the question is (3^6) + (a^~a):
As explained above, 3^6 --> 5.
Again considering a 16-bit compiler, with a as an int holding some garbage value, let's assume a = 1.
then a will be 0000 0000 0000 0001
and ~a will be 1111 1111 1111 1110
so a^~a will be --> 1111 1111 1111 1111 --> 65535 (treated as unsigned)
Therefore (3^6) + (a^~a) = 5 + 65535 = 65540, which is outside the 16-bit range (0 to 65535).
The result wraps around modulo 65536: 65540 - 65536 = 4.
Answer = 4
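A minimal sketch of this case, assuming the 16-bit int above and a = 1 as the garbage value; uint16_t is used here to emulate 16-bit arithmetic on a modern compiler:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint16_t a = 1;                          /* assumed garbage value, as in the text */
        uint16_t not_a = (uint16_t)~a;           /* 1111 1111 1111 1110 = 0xFFFE */
        uint16_t x = a ^ not_a;                  /* 1111 1111 1111 1111 = 65535 */
        uint16_t sum = (uint16_t)((3 ^ 6) + x);  /* 5 + 65535 = 65540, wraps mod 65536 -> 4 */
        printf("a ^ ~a         = %u\n", (unsigned)x);
        printf("(3^6) + (a^~a) = %u\n", (unsigned)sum);
        return 0;
    }

Note that on a 32-bit compiler the result is the same: a^~a has all 32 bits set, which is -1 as a signed int, and 5 + (-1) is again 4.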