Based on this interesting question, Addition of int and uint, and on toying around with constant folding as mentioned in Nicholas Carey's answer, I've stumbled upon seemingly inconsistent behavior in the compiler:
Consider the following code snippet:
int i = 1;
uint j = 2;
var k = i - j;
Here the compiler correctly resolves k to long. This particular behavior is well defined in the specification, as explained in the answers to the question referenced above.
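To double-check this at runtime (a minimal sketch of my own, not from the snippet above; it assumes an ordinary Main method or top-level program with using System; in scope):
int i = 1;
uint j = 2;
var k = i - j;
Console.WriteLine(k.GetType()); // prints System.Int64: both non-constant operands are promoted to long
Console.WriteLine(k);           // prints -1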
What was surprising to me is that the behavior changes when dealing with literal constants, or constants in general. Reading Nicholas Carey's answer, I realized that the behavior could be inconsistent, so I checked, and sure enough:
const int i = 1;
const uint j = 2;
var k = i - j; // Compile-time error: The operation overflows at compile time in checked mode.
k = 1 - 2u;    // Compile-time error: The operation overflows at compile time in checked mode.
k in this case is resolved to UInt32.
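Indeed, wrapping the same expression in unchecked (a small sketch I added to confirm this, not part of the original snippet) makes it compile and reveals the chosen type:
var k = unchecked(1 - 2u);
Console.WriteLine(k.GetType()); // prints System.UInt32: the literal 1 fits in a uint, so uint subtraction is chosen
Console.WriteLine(k);           // prints 4294967295, i.e. uint.MaxValue, because -1 wraps around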
Is there a reason for the behavior being different when dealing with constants, or is this a small but unfortunate "bug" (for lack of a better term) in the compiler?