I have the following code:

char temp[] = { 0xAE, 0xFF };
printf("%X\n", temp[0]);

Why is the output FFFFFFAE and not just AE?
I tried

printf("%X\n", 0b10101110);

and the output is correct: AE.

Any suggestions?
The answer you're getting, FFFFFFAE, is a result of the char data type being signed on your platform. If you check the value, you'll notice that it's equal to -82, and -82 + 256 = 174, which is 0xAE in hexadecimal.

The reason you get the correct output when you print 0b10101110 (or 174) directly is that a literal already has type int. In your example, you first store 0xAE in a signed char, where it is reinterpreted modulo 256 into the range -128 to 127, if you want to think of it that way.
So in other words:
0 = 0 = 0x00
127 = 127 = 0x7F
128 = -128 = 0xFFFFFF80
129 = -127 = 0xFFFFFF81
174 = -82 = 0xFFFFFFAE
255 = -1 = 0xFFFFFFFF
256 = 0 = 0x00
To fix this "problem", you can declare the same array you initially did, just make it an array of unsigned char, and your values will print as you expect.
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    unsigned char temp[] = { 0xAE, 0xFF };

    printf("%X\n", temp[0]);
    printf("%d\n\n", temp[0]);

    printf("%X\n", temp[1]);
    printf("%d\n\n", temp[1]);

    return EXIT_SUCCESS;
}
Output:
AE
174
FF
255
https://linux.die.net/man/3/printf
According to the man page, %x and %X expect an unsigned int, so printf will read a full int's worth of bytes (typically 4) for that argument.

In any case, C's default argument promotions apply to variadic functions: you can't pass an argument smaller than an int, so your char is converted to int before printf ever sees it.
In the first case, you're passing a char, so it is converted to int. Both types are signed, so the conversion sign-extends the value, which is why you see the leading FFs.
In your second example, you're actually passing an int
all the way, so no cast is performed.
If you'd try:
printf("%X\n", (char) 0b10101110);
you'd see that FFFFFFAE is printed (assuming char is signed on your platform).
When you pass a data type smaller than int (as char is) to a variadic function (as printf(3) is), the parameter undergoes the default argument promotions: a char, signed or unsigned, is promoted to int (a promotion to unsigned int only happens for types whose values don't all fit in an int). What you observe is sign extension: because the most significant bit of the char variable is set, it is replicated into the three upper bytes needed to fill an int.
To solve this and keep the data in 8 bits, you have two possibilities:

Allow your signed char to be converted to an int (with sign extension), then mask off bit 8 and above:
printf("%X\n", (int) my_char & 0xff);
Declare your variable as unsigned char, so its value (0 to 255) is preserved by the promotion:
unsigned char my_char;
...
printf("%X\n", my_char);
This code causes undefined behaviour. The argument matching %X must have type unsigned int, but you supply a char, which is promoted to int.

Undefined behaviour means that anything can happen, including, but not limited to, extra F's appearing in the output.