
This is my code:

#include <stdio.h>

int main(void)
{
    unsigned x = 1;
    signed char y = -1;

    if (x > y)
        printf("x>y");
    else
        printf("x<=y");

    return 0;
}

When I compile this with gcc and run it, the output is "x<=y".

Can anyone explain why?

Dave

1 Answer


In x>y, we have an unsigned int and a signed char. The usual arithmetic conversions of C apply: the signed char is first promoted to int, and, because the other operand is unsigned int, that int is then converted to unsigned int before the comparison is performed.

When −1 is converted to unsigned, the result cannot be −1, of course, because unsigned cannot represent negative values. The rules of C say that a conversion of a negative value to unsigned is performed by adding UINT_MAX+1 to the value (as many times as necessary to make a non-negative value). For example, if UINT_MAX is 65535, then 65536 is added. So the result of converting −1 to unsigned is −1 + 65536 = 65535.
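
For illustration, here is a minimal, self-contained program (not part of the original question; the variable names are mine) that prints the value this conversion produces. On a typical system where unsigned int is 32 bits, both lines print 4294967295:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    signed char y = -1;
    unsigned u = y;  /* implicit conversion: -1 + (UINT_MAX + 1) */

    printf("UINT_MAX    = %u\n", UINT_MAX);
    printf("(unsigned)y = %u\n", u);  /* same value as UINT_MAX */
    return 0;
}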

Thus, the comparison becomes 1>65535, which is false. (UINT_MAX can also be larger, such as 4,294,967,295, in which case the comparison becomes 1>4294967295, which is also false.)
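
If you want the mathematically expected result, one option is to widen both operands to a signed type that can represent every value of both sides. A minimal sketch, assuming long long is wider than unsigned int (true on all common platforms):

#include <stdio.h>

int main(void)
{
    unsigned x = 1;
    signed char y = -1;

    /* y is converted to unsigned, so this compares 1 > UINT_MAX: prints 0 */
    printf("%d\n", x > y);

    /* widening both sides to long long keeps -1 negative: prints 1 */
    printf("%d\n", (long long)x > (long long)y);
    return 0;
}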

Eric Postpischil
  • I would add that you should be explicit about the width of unsigned types (unsigned char, unsigned short, unsigned int, unsigned long). Also, you should really only do arithmetic between values of the same type, or, if that is not practical, cast so the conversion is explicit: for example, average = (double)x / (double)y; (see the sketch below). – JWDN Feb 16 '19 at 16:35
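
As a small illustration of that comment's advice (the values here are arbitrary, chosen only to make the output easy to check), spelling out the full type names and casting both operands makes the intended arithmetic visible:

#include <stdio.h>

int main(void)
{
    unsigned int x = 7;  /* type spelled out in full, not just "unsigned" */
    unsigned int y = 2;

    /* casting both operands to double makes the floating-point division explicit */
    double average = (double)x / (double)y;
    printf("%f\n", average);  /* prints 3.500000 */
    return 0;
}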