
Recently, in the CCS standard library, I have seen something like this:

a function taking an unsigned int argument:

delay(unsigned int)

It was used like this:

delay(~(0));

Now, how will (~(0)) be interpreted?

If I use it like this:

printf("%d",(~(0)));

how is it treated: as signed int? unsigned int? long int? or unsigned char?

Does it depend on the system? What does the C spec say about this?

Jeegar Patel
  • In `printf`, it all depends on the format specifier. However, you must be careful that illegal conversions are undefined (e.g. integer to float and vice versa) – 0xF1 Jun 24 '14 at 05:49
  • In `printf()`, it doesn't matter what the parameter is, and it doesn't even matter what the format specifier is, since parameters of variadic functions are subject to integer promotion. `printf("%d",(char)(~(0)))`, `printf("%d", (short)(~(0)))`, and `printf("%d", (int)(~(0)))` will all pass the same type, an `int`. – mfro Jun 24 '14 at 06:00

3 Answers

5

Recall that ~ is the bitwise NOT operator, and that 0 is an integer, sizeof(int) bytes in size.

Since 0 is all bits set to zero, (~0) is all bits set to 1.

If (like most systems), your sizeof(int)==4, then (~0) == 0xFFFFFFFF.

Basically, this is delaying for the maximum value possible: assuming the signature is delay(unsigned int), ~0 is converted to UINT_MAX when passed.


#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* sizeof yields a size_t, so use %zu */
    printf("sizeof(0)=%zu sizeof(~0)=%zu\n", sizeof(0), sizeof(~0));
    /* %X expects an unsigned value, so cast */
    printf("0x%X\n", (unsigned)(~0));
    printf("%d\n", (~0) == UINT_MAX);
    return 0;
}

Output:

$ ./a.out
sizeof(0)=4 sizeof(~0)=4
0xFFFFFFFF
1
Jonathon Reinhart
  • You said "0 is an integer" — how can you say that? Any standard? – Jeegar Patel Jun 24 '14 at 05:53
  • Sure, someone could dig up the spec... but I mean, come on. *It is an integer* - what else could it be? – Jonathon Reinhart Jun 24 '14 at 05:55
  • I doubt that — shouldn't an integer literal have the smallest type that can represent its value? And if I use a literal greater than the maximum int value, would it be treated as int, or something else? – Jeegar Patel Jun 24 '14 at 05:59
  • See [Type of integer literal not int by default?](http://stackoverflow.com/q/8108642/119527) – Jonathon Reinhart Jun 24 '14 at 06:03
  • Agreed now.. Thanks!! We are SO users; we understand "spec", not "come on... blah blah.." :) – Jeegar Patel Jun 24 '14 at 06:06
  • Maybe noteworthy, that `(~0)` has type `int` and passing it to a function accepting `unsigned int`, the value will be converted, which changes the value if signed values aren't represented as a two's complement (if I'm not horribly mistaken atm). `(~0u)` would be correct in this case. – mafso Jun 24 '14 at 12:06
0

By default, integer constants that fit are treated as (signed) int. When you pass one to printf, the argument undergoes the default argument promotions, and printf then interprets it according to the format specifier you supply.

In delay(~(0));, delay() expects an unsigned int argument, so ~0 — an int with all bits set, i.e. the bit pattern 0xffffffff when int is 4 bytes — will be converted to unsigned int.

0xF1
0

~0 is an int with all bits set to 1 (since unary ~ is the bitwise NOT operator). On two's complement machines like x86 — which is to say, most machines — its value is -1.

So printf("%d\n", ~0); outputs -1.

Basile Starynkevitch