
I want to convert a char value to an int without losing the signed meaning, so I wrote the following code in int_test.c, and it works:

#include <stdlib.h>
#include <stdio.h>
#include <stdint.h>

#define c2int(x) \
({                         \
        int t;             \
        if (x > 0x80)      \
                t = x | (1 << sizeof(int) * 8) - (1 << sizeof(char) * 8); \
        else               \
                t = x;     \
        t;                 \
 })

int main()
{
        uint8_t a = 0xFE;
        int b;

        b = c2int(a);

        printf("(signed char)a = %hhi, b = %d\n", a, b);

        exit(EXIT_SUCCESS);
}

The output is:

(signed char)a = -2, b = -2

The compiler output is:

gcc -o int_test int_test.c
int_test.c: In function ‘main’:
int_test.c:9:15: warning: left shift count >= width of type [-Wshift-count-overflow]
         t = x | (1 << sizeof(int) * 8) - (1 << sizeof(char) * 8); \
                  ^
int_test.c:20:6: note: in expansion of macro ‘c2int’
  b = c2int(a);

My questions are:

1. Is there a simpler and more efficient way to do the conversion?
2. How is the sign extension determined when simply converting char to int?
3. How can I avoid the warning above?

Thank you.

3 Answers


You are doing manual, explicit sign conversion. Don't do that. Instead:

static int c2int(unsigned char x)
{
    return (signed char)x;
}

That does sign extension for you, and doesn't produce warnings.
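
For reference, a minimal test program built around this helper (modelled on the question's int_test.c) might look like the following; the -2 result assumes the usual two's-complement representation:

#include <stdio.h>
#include <stdint.h>

/* Sign-extend by casting through signed char; the conversion from
 * signed char to int then preserves the (negative) value. */
static int c2int(unsigned char x)
{
    return (signed char)x;
}

int main(void)
{
    uint8_t a = 0xFE;           /* bit pattern 1111 1110 */
    int b = c2int(a);

    printf("b = %d\n", b);      /* prints b = -2 */
    return 0;
}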

John Zwinck
    On some platforms `char` is unsigned, so for improved portability it should be `return (signed char)x;` – Klas Lindbäck Sep 14 '16 at 12:05
  • Isn't the cast to signed from unsigned implementation defined? Or even UB in case of trap values? – LPs Sep 14 '16 at 12:09
    @LPs implementation-defined "When a value with integer type is converted to another integer type ... Otherwise, the new type is signed and the value cannot be represented in it; either the result is implementation-defined or an implementation-defined signal is raised." C11 6.3.1.3 – chux - Reinstate Monica Sep 14 '16 at 12:14
    @LPs Luckily, in all mainstream implementations it will work. But I'm sure there are old (like the 1970s) platforms where it won't. It would be worth it to point out in the answer that it depends on implementation-defined behaviour, though. – Klas Lindbäck Sep 14 '16 at 12:39
    @Klas Lindbäck The issue about modern vs 1970s computers does not consider _modern_ compilers on _modern_ platforms. Many have been known to take advantage of ID behavior and UB to create highly optimized code. Relying on a casual understanding of a compiler's ID can lead to code not behaving as desired - hence the greater concern in 2016 to avoid ID behavior (as well as UB). – chux - Reinstate Monica Sep 14 '16 at 14:37
    @chux I agree. You need to know what you are doing and be willing to do the work of checking the manual for each platform that you support. You have to weigh that effort against the effort to write and maintain code that doesn't use any ID (or UB). If anyone wants to deep-dive into how to avoid it in this case I recommend (the question/answers cover both C and C++ and even differences between the different versions of the C standard): http://stackoverflow.com/questions/13150449/efficient-unsigned-to-signed-cast-avoiding-implementation-defined-behavior – Klas Lindbäck Sep 15 '16 at 07:55
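
If you want to sidestep the implementation-defined cast entirely, a sketch in the spirit of that linked question (the helper name u8_to_int is mine) can do the extension with plain, fully defined arithmetic instead; this assumes the usual 8-bit char:

#include <limits.h>

/* Reinterpret the byte as a two's-complement value using only
 * arithmetic, with no out-of-range signed cast involved. */
static int u8_to_int(unsigned char x)
{
    /* 0..SCHAR_MAX map to themselves; larger values represent
     * negative numbers, so subtract 2^CHAR_BIT (UCHAR_MAX + 1). */
    return (x > SCHAR_MAX) ? (int)x - (UCHAR_MAX + 1) : (int)x;
}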

Do you want sign-extension or not? If you want sign-extension, you have to go through a signed char first. If not, then just use the implicit conversion done by assignment or initialization:

unsigned char x = 0xfe;
int y = (signed char) x;
int z = x;
printf("x = %hhx, y = %08x, z = %08x\n", x, y, z);

The above code should print

x = fe, y = fffffffe, z = 000000fe
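
As a runnable sketch, the snippet above can be wrapped in a complete program like this (the printed values assume a 32-bit int and two's complement; the casts to unsigned are added here to keep the %x format matched to its arguments):

#include <stdio.h>

int main(void)
{
    unsigned char x = 0xfe;
    int y = (signed char) x;    /* sign-extended: 0xfffffffe */
    int z = x;                  /* zero-extended: 0x000000fe */

    printf("x = %hhx, y = %08x, z = %08x\n",
           x, (unsigned) y, (unsigned) z);
    return 0;
}
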
Some programmer dude
  1. Is there a simpler and more efficient way to do the conversion?

    // To convert a value to char and then to int
    // As a function or macro
    #define c2int(x) ((int)(char)(x))
    int c2int(char x) { return x; }
    
    // To convert a value to signed char and then to int
    #define c2int(x) ((int)(signed char)(x))
    int c2int(signed char x) { return x; }
    
  2. How is the sign extension determined when simply converting char to int?

    No need for special code, see above. C does this for you.

  3. How can I avoid the warning above?

    Ensure every shift count is less than the bit width of the type.
    Avoid shifting into the sign bit; see the sketch below.
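
As an illustration of point 3, here is a sketch of the question's macro rewritten so that every shift count stays below the width of int; the extension itself is done with ordinary subtraction, so nothing is shifted into the sign bit:

#include <limits.h>

/* The shift counts are CHAR_BIT - 1 and CHAR_BIT (7 and 8 on the usual
 * 8-bit-char platforms), well below the width of int, so
 * -Wshift-count-overflow is not triggered. Using >= also treats 0x80
 * itself as negative, unlike the original > test. */
#define c2int(x) \
        ((x) >= (1 << (CHAR_BIT - 1)) ? (int)(x) - (1 << CHAR_BIT) : (int)(x))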

chux - Reinstate Monica