a is a signed int and contains -100; b is an unsigned int and contains 500.
a < b evaluates to false!!
Why on earth? :P
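Here is a minimal program that reproduces what I'm seeing (I happen to compile it as plain C with GCC, but I assume this is not compiler-specific):

    #include <stdio.h>

    int main(void)
    {
        int a = -100;          /* signed */
        unsigned int b = 500;  /* unsigned */

        if (a < b)
            printf("a < b is true\n");
        else
            printf("a < b is false\n");  /* this is what actually prints */

        return 0;
    }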
I can cast b to signed int and get the correct result, but leaving things as they are, the result surprises me a lot, because I have no idea why -100 < 500 should be false; it's as if the compiler casts a to an unsigned type automatically (and this was clearly not requested by the programmer).
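The cast workaround I mean is something like this (fine for my values, since b always fits in an int):

    #include <stdio.h>

    int main(void)
    {
        int a = -100;
        unsigned int b = 500;

        /* Cast b to signed int so the comparison is done in signed arithmetic.
           This is only safe because b is known to fit in an int. */
        if (a < (int)b)
            printf("a < (int)b is true\n");   /* this prints, as I'd expect */
        else
            printf("a < (int)b is false\n");

        return 0;
    }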
If we keep them as they are, i.e. the first signed and the second unsigned, then why should a < b evaluate to false?
This is really confusing.
Now I have to go through all of my code, looking for comparisons between signed and unsigned ints, and cast both operands to the type I actually mean. :-/
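For the places where a plain cast isn't safe, I'm thinking of a small helper along these lines (my own sketch, and the function name is just something I made up):

    #include <stdio.h>

    /* Hypothetical helper: true if a signed int is less than an unsigned int,
       without letting the implicit signed-to-unsigned conversion bite. */
    static int int_less_than_uint(int s, unsigned int u)
    {
        if (s < 0)
            return 1;                 /* any negative value is below any unsigned value */
        return (unsigned int)s < u;   /* both non-negative, safe to compare as unsigned */
    }

    int main(void)
    {
        printf("%d\n", int_less_than_uint(-100, 500u));  /* prints 1 */
        return 0;
    }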
Is there any other situation I have to be careful about when mixing signed and unsigned types?
Please do not reply with the obvious "generally, the use of unsigned types is not advisable, why don't you stick with signed types only? you will be much safer". THANKS.
Cheers.