3

Code

#include <stdio.h>
int main(void) {
    char a;
    a = 0xf1;
    printf("%x\n", a);
}

Output

fffffff1

printf() shows 4 bytes, even though a is exactly one byte.

What is the reason for this misbehavior?

How can I correct it?

EsmaeelE
  • Wrong format. Try `int a` – Jean-François Fabre Aug 19 '17 at 20:40
  • `char a;` --> `unsigned char a;` – BLUEPIXY Aug 19 '17 at 20:42
  • It was cast to an integer, which extended the sign – Lavaman65 Aug 19 '17 at 20:45
  • @Jean-FrançoisFabre `printf("%zu", sizeof(int));` shows 4 bytes; I want only one byte. – EsmaeelE Aug 19 '17 at 20:47
  • A `char` would be promoted to `int` by virtue of the integer promotions anyway... – ad absurdum Aug 19 '17 at 20:47
  • @DavidBowling That's true, David, but consider that `int` is 4 bytes long while `char` is only one byte; try `sizeof`. I know that characters in C are converted to their corresponding ASCII codes. – EsmaeelE Aug 19 '17 at 20:50
  • @EsmaeelE You can try `sizeof`, but by the time you finish writing a `printf`-like function in order to test the `sizeof` of its argument, you could have just read the manual... In fact, you *should* read the manual. – autistic Aug 19 '17 at 20:56
  • @EsmaeelE There is no requirement in standard C that ASCII be the character set used. Consider that C also runs on machines using EBCDIC. – autistic Aug 19 '17 at 20:57
  • @EsmaeelE The comments section is for requesting and providing clarification for the question you asked. Please don't diverge so much from the original topic. I'm not your research hound. You can find that information. – autistic Aug 19 '17 at 21:12
  • @EsmaeelE-- if you just google [EBCDIC](https://en.wikipedia.org/wiki/EBCDIC) you would get the Wikipedia page I just linked to. – ad absurdum Aug 19 '17 at 21:14
  • @Seb About your first comment: which manual must I read instead of using `sizeof`? – EsmaeelE Aug 19 '17 at 21:26
  • @EsmaeelE I assume you're programming for a Unix system, since you're using a Unix programming language, so a Unix manual might be a good place to start! [Here's one](http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/stdarg.h.html)... Pay careful attention to the parts about *integer promotion*! – autistic Aug 19 '17 at 21:42
  • @EsmaeelE [Here's one about a common compiler/standard library](http://www.gnu.org/software/libc/manual/html_node/Calling-Variadics.html) which says "objects of type char or short int (whether signed or not) are promoted to either int or unsigned int" and "if the caller passes a char as an optional argument, it is promoted to an int, and the function can access it with va_arg (ap, int)." (A sketch illustrating this follows the comments.) – autistic Aug 19 '17 at 21:44
  • @EsmaeelE You can't safely learn C by trial and error! It has undefined behaviours which you must learn to avoid unless you want to risk being sued for losses caused by serious bugs (e.g. the heartbleed SSL bug). Fortunately the technology you're using comes with manuals which you can read... so read them *BEFORE* you resort to guessing! Perhaps your guessing might lead you to some flawed conclusion which works for your system, then gets you in trouble because it doesn't for some other... at least the manual would be your defense in court. – autistic Aug 19 '17 at 21:54
  • @Seb How can I separate the bytes with `printf()`, like ff_ff_ff_ff? – EsmaeelE Aug 19 '17 at 22:04
  • @EsmaeelE The comments section here is not the right place to ask a new question. Nonetheless, *before* you ask that new question you should conduct adequate research or else you'll end up with a *closed, unanswered* question... Which research have you conducted? I recall learning about division in primary school; perhaps you should do some research into that (and its related friend, modulo)... – autistic Aug 20 '17 at 12:01
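To illustrate the promotion described in the manual excerpt quoted above, here is a minimal sketch; the helper print_low_byte is hypothetical, invented for this illustration. A char passed as a variadic argument arrives as an int and must be read with va_arg(ap, int):

#include <stdarg.h>
#include <stdio.h>

/* Hypothetical helper for illustration: prints the low byte of its
   single variadic argument. */
static void print_low_byte(int count, ...)
{
    va_list ap;
    va_start(ap, count);
    int v = va_arg(ap, int);            /* the char was promoted to int by the caller */
    va_end(ap);
    printf("%x\n", (unsigned char) v);  /* mask back down to one byte */
}

int main(void)
{
    char c = 0xf1;         /* implementation-defined; typically -15 when char is signed */
    print_low_byte(1, c);  /* c undergoes integer promotion before the call; prints f1 */
    return 0;
}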

3 Answers

4

printf is a variadic function, so the compiler does its best, but it cannot strictly check that each argument's type complies with its format specifier.

Here you're passing a char with a %x (unsigned int, hexadecimal) format specifier.

So the value is promoted to a signed int. Because 0xf1 is greater than 127, it doesn't fit in a signed char, and char is signed on most systems (on yours it certainly is, given the output): a ends up negative, and the promotion sign-extends it to 0xfffffff1.

Either:

  • change a to int (simplest)
  • change a to unsigned char (as suggested by BLUEPIXY), which avoids the sign extension during the promotion
  • change the format to %hhx, as stated in the various docs (note that on my gcc 6.2.1 compiler hhx is not recognized, even though hx is); all three options are shown in the sketch below
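
A minimal sketch of all three options side by side; the outputs in the comments assume a typical system with a signed 8-bit char, a 32-bit int, and a compiler that accepts the C99 hh length modifier:

#include <stdio.h>

int main(void)
{
    int a1 = 0xf1;            /* option 1: an int holds 0xf1 without overflow */
    unsigned char a2 = 0xf1;  /* option 2: unsigned, so no sign extension on promotion */
    char a3 = 0xf1;           /* original code: implementation-defined, typically -15 */

    printf("%x\n", a1);       /* f1 */
    printf("%x\n", a2);       /* f1 (promoted to int, but the value stays 241) */
    printf("%hhx\n", a3);     /* f1 (printf converts the argument back to unsigned char) */
    return 0;
}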

Note that the compiler warns you, even before printf is reached, that you have a problem:

gcc -Wall -Wpedantic test.c
test.c: In function 'main':
test.c:6:5: warning: overflow in implicit constant conversion [-Woverflow]
 a = 0xf1;
Jean-François Fabre
4

What is the reason for this misbehavior?

This question looks strangely similar to another I have answered; it even contains a similar value (0xfffffff1). In that answer, I provide some information required to understand what conversion happens when you pass a small value (such as a char) to a variadic function such as printf. There's no point repeating that information here.

If you inspect CHAR_MIN and CHAR_MAX from <limits.h>, you're likely to find that your char type is signed, and so 0xf1 does not fit as an integer value inside of a char.

Instead, it ends up being converted in an implementation-defined manner, which for most of us means one of the high-order bits becomes the sign bit. When such a value is promoted to int (in order to be passed to printf), sign extension occurs to preserve its value: a char holding -1 must become an int holding -1, and likewise the underlying representation in your example is likely transformed from 0xf1 to 0xfffffff1.

printf("CHAR_MIN .. CHAR_MAX: %d .. %d\n", CHAR_MIN, CHAR_MAX);
printf("Does %d fit? %s\n", '\xFF', '\xFF' >= CHAR_MIN && '\xFF' <= CHAR_MAX ? "Yes!"
                                                                             : "No!");

printf("%d %X\n", (char) -1, (char) -1); // Both of these get converted to int
printf("%d %X\n", -1, -1);               // ... and so are equivalent to these

How can I correct it?

Declare a with a type that can fit the value 0xf1, for example int or unsigned char.
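
For instance, a minimal sketch of both suggestions; the outputs in the comments assume a typical system:

#include <stdio.h>

int main(void)
{
    unsigned char a = 0xf1;  /* 0xf1 (241) fits in an unsigned char */
    int b = 0xf1;            /* ... and trivially in an int */
    printf("%x\n", a);       /* promoted to int 241; prints f1 */
    printf("%x\n", b);       /* prints f1 */
    return 0;
}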

autistic
0

You should use int a instead of char a. A char can store only one byte, and whether its range is -128 to 127 (signed) or 0 to 255 (unsigned) is implementation-defined; on your system char is signed, so the value 0xf1 (241) does not fit. An int has a storage size of at least 2 (usually 4) bytes, so it can hold 0xf1 without overflow; a sketch follows.
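
A minimal sketch, assuming a typical system where int is 4 bytes:

#include <stdio.h>

int main(void)
{
    int a = 0xf1;                                /* 241 fits comfortably in an int */
    printf("sizeof(int) = %zu\n", sizeof(int));  /* typically 4 */
    printf("%x\n", a);                           /* prints f1, no sign extension */
    return 0;
}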

rensothearin