
I am writing a bit of C code in which I am presented with the following typedefs:

typedef uint8_t BYTE;
typedef int8_t SBYTE;
typedef uint16_t WORD;
typedef uint32_t DWORD;
typedef uint64_t QWORD;

I have a function with the signature QWORD exp_d(const DWORD *in).

And I have a pointer to a QWORD, QWORD *input. Being a QWORD, it is 64 bits wide. Now, I want to pass the least-significant 32 bits of input to exp_d. What I am doing is:

QWORD expansion=exp_d(((DWORD*)input)+1);

My thinking is: input is a QWORD*, so first casting it to a DWORD* and then incrementing it by 1 (to get to the next DWORD, i.e. the least-significant 32 bits of the QWORD) should do the trick. However, when I pass such a value to exp_d, I get the most-significant 32 bits of input rather than the least-significant 32 bits as expected.
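
For reference, here is a minimal reproduction of what I am seeing. The exp_d body below is a hypothetical stand-in that just echoes the DWORD it receives, and 0x1122334455667788 is an arbitrary test value:

#include <stdint.h>
#include <stdio.h>

typedef uint32_t DWORD;
typedef uint64_t QWORD;

/* hypothetical stand-in for exp_d: echoes the DWORD it is given */
QWORD exp_d(const DWORD *in) { return *in; }

int main(void) {
    QWORD value = 0x1122334455667788ULL;
    QWORD *input = &value;
    QWORD expansion = exp_d(((DWORD*)input)+1);
    /* on my (little-endian) machine this prints 11223344, the high half,
       rather than the expected 55667788 */
    printf("%08llx\n", (unsigned long long)expansion);
    return 0;
}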

Where am I going wrong?

— tigerden

2 Answers


Use the numeric value instead of the pointer, since the type of input is QWORD:

(uint32_t) ((*input) & 0xffffffff)
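
To pass that to exp_d, which takes a pointer, a minimal sketch is to store the masked value in a temporary first (low is just an illustrative name):

/* extract the low 32 bits by value */
DWORD low = (DWORD)((*input) & 0xffffffff);
QWORD expansion = exp_d(&low);

Masking the value like this works regardless of the host's byte order, so no endianness check is needed.
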
— perreal

If the function exp_d can't be modified, then you might need to define another function that returns a pointer to the lower 32 bits of the 64-bit value. Inside the function, return the address according to your system's endianness.

/* Return a pointer to the low-order DWORD inside a QWORD,
   taking the host byte order into account. */
const DWORD *GetDwordFromQword(const QWORD *input)
{
    if (BigEndian()) {
        /* big-endian: the least-significant 32 bits sit at the higher address */
        return (const DWORD *)input + 1;
    } else {
        /* little-endian: the least-significant 32 bits sit at the lower address */
        return (const DWORD *)input;
    }
}
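
With this in place, the call site from the question becomes:

QWORD expansion = exp_d(GetDwordFromQword(input));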

A BigEndian() definition can be found here: Detecting endianness programmatically in a C++ program
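
For completeness, a minimal sketch of such a check, along the lines of the linked question (the probe approach below is one common option, not the only one):

#include <stdint.h>

/* Returns nonzero on a big-endian host, where the most significant
   byte of a multi-byte integer is stored at the lowest address. */
static int BigEndian(void)
{
    const uint16_t probe = 0x0001;
    return *(const uint8_t *)&probe == 0;
}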

— yongzhy