I have a text file that I have to read and parse. Each line contains a hex value in ASCII form, for example:
0100002c
0100002c
80000000
08000000
0a000000
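For context, this is roughly how I read the values in (a minimal sketch; the file name and the lack of error handling are placeholders):

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void) {
    FILE *fp = fopen("data.txt", "r");  /* placeholder file name */
    if (!fp)
        return EXIT_FAILURE;

    char line[32];
    while (fgets(line, sizeof line, fp)) {
        /* parse as unsigned so values like 80000000 fit, then
           reinterpret the bit pattern as a signed 32-bit integer
           (two's complement assumed) */
        uint32_t u = (uint32_t)strtoul(line, NULL, 16);
        int32_t raw = (int32_t)u;
        printf("%d\n", raw);
    }
    fclose(fp);
    return EXIT_SUCCESS;
}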
After converting each value to a signed 32-bit integer, I have to check the bits in the following fashion (a portable decoding sketch follows the list):
bit #31 => should result in decimal 0 or 1
bits #30 to #24 => should result in decimal 0 to 10
bits #23 to #0 => should result in a signed decimal number
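To make the intended layout unambiguous, here is the same decoding written with shifts and masks (a sketch, independent of the union below):

#include <stdint.h>

/* Sketch: decode one raw value portably, without bit-fields. */
static void decode(uint32_t u,
                   unsigned *binop, unsigned *operation, int32_t *data)
{
    *binop     = (u >> 31) & 0x01u;           /* bit 31: 0 or 1     */
    *operation = (u >> 24) & 0x7Fu;           /* bits 30..24: 0..10 */
    int32_t d  = (int32_t)(u & 0x00FFFFFFu);  /* bits 23..0         */
    if (d & 0x00800000)                       /* sign-extend the 24-bit field */
        d -= 0x01000000;
    *data = d;
}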
I assign the raw int32 to the following struct/union (I only ever set the raw member):
typedef struct DATA {
    union {
        int32_t raw;
        struct {
            int32_t data:24;
            uint8_t operation:7;
            uint8_t binop:1;
        };
    } members;
} data_t;
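For example, with the first value from the file I expect the fields to come out as in the Linux output below:

data_t d;
d.members.raw = (int32_t)0x0100002c;
/* expected: d.members.binop == 0, d.members.operation == 1,
             d.members.data  == 44 */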
Now the problem: on a Linux machine, compiling with GCC (I tried 4.8 and 5.4), I get the correct results using the following function:
void vm_data_debug(data_t* const inst, int const num) {
    printf("DEBUG DATA #%d => "
           " RAW: %-12d"
           " BINCODE: %-1d"
           "\tOPCODE: %-1d"
           "\tDATA: %-10d",
           num, inst->members.raw, inst->members.binop,
           inst->members.operation, inst->members.data);
    printf("\tBITS: ");
    vm_data_print_raw_bits(sizeof inst->members.raw, &inst->members.raw);
}
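vm_data_print_raw_bits is my own helper and is not shown above; a minimal version that prints the bits MSB-first (byte order assuming a little-endian host, which matches the output below) would be:

#include <stdio.h>
#include <stddef.h>

static void vm_data_print_raw_bits(size_t size, void const *ptr)
{
    unsigned char const *bytes = ptr;
    for (size_t i = size; i-- > 0; )          /* most significant byte first */
        for (int bit = 7; bit >= 0; --bit)
            putchar((bytes[i] >> bit) & 1 ? '1' : '0');
    putchar('\n');
}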
And here is the output for the example input at the top of the question, on the Linux machine, fine and dandy:
DEBUG DATA #0 => RAW: 16777260 BINCODE: 0 OPCODE: 1 DATA: 44 BITS: 00000001000000000000000000101100
DEBUG DATA #1 => RAW: 16777260 BINCODE: 0 OPCODE: 1 DATA: 44 BITS: 00000001000000000000000000101100
DEBUG DATA #2 => RAW: -2147483648 BINCODE: 1 OPCODE: 0 DATA: 0 BITS: 10000000000000000000000000000000
DEBUG DATA #3 => RAW: 134217728 BINCODE: 0 OPCODE: 8 DATA: 0 BITS: 00001000000000000000000000000000
DEBUG DATA #4 => RAW: 167772160 BINCODE: 0 OPCODE: 10 DATA: 0 BITS: 00001010000000000000000000000000
Now, running the exact same code on a Windows machine (the same physical machine I run Linux on), I get a very different result (compiled with both MinGW and MSVC 2015):
DEBUG DATA #0 => RAW: 16777260 BINCODE: 1 OPCODE: 77 DATA: 44 BITS: 00000001000000000000000000101100
DEBUG DATA #1 => RAW: 16777260 BINCODE: 1 OPCODE: 77 DATA: 44 BITS: 00000001000000000000000000101100
DEBUG DATA #2 => RAW: 2147483647 BINCODE: 1 OPCODE: 77 DATA: -1 BITS: 01111111111111111111111111111111
DEBUG DATA #3 => RAW: 134217728 BINCODE: 1 OPCODE: 77 DATA: 0 BITS: 00001000000000000000000000000000
DEBUG DATA #4 => RAW: 167772160 BINCODE: 1 OPCODE: 77 DATA: 0 BITS: 00001010000000000000000000000000
So the question is: where does this difference come from, and what should I do to make the behavior consistent between Windows and Linux?
I have checked this question, but it does not fix the problem for me; making the union members all signed or all unsigned still does not work on Windows.