4

I have a char array buffer like this:

char buf[4];
buf[0] = 0x82;
buf[1] = 0x7e;
buf[2] = 0x01;
buf[3] = 0x00;

I would now like to read chars two and three together as a 16-bit unsigned integer in big-endian byte order. How do I do this with standard C(++) tools?

Currently I only know the manual solution:

int length = buf[3];
length += buf[2] << 8;

This is easy enough for 16-bit integers, but I also need to parse 32-bit integers, which makes things a bit more awkward. So is there a function in the standard library which does this for me?

Bodo

bodokaiser

3 Answers

6

You can use ntohs and ntohl, which convert 16- and 32-bit values from network (big-endian) byte order to host byte order (on a big-endian system they are no-ops):

#include <iostream>
#include <cstring>
#include <cstdint>      // uint16_t, uint32_t
#include <arpa/inet.h>  // ntohs, ntohl

int main() {
    char buf[4];
    buf[0] = 0x82;
    buf[1] = 0x7e;
    buf[2] = 0x01;
    buf[3] = 0x00;
    uint16_t raw16;
    uint32_t raw32;
    memcpy(&raw16, buf + 2, 2);  // bytes 2..3, still big-endian
    memcpy(&raw32, buf,     4);  // bytes 0..3, still big-endian
    uint16_t len16 = ntohs(raw16);  // swap to host order
    uint32_t len32 = ntohl(raw32);
    std::cout << len16 << std::endl;
    std::cout << len32 << std::endl;
    return 0;
}

Or you can swap the bytes around and reinterpret them as the appropriate type instead of shifting.

perreal
0

You can use a union:

union conv {
    char arr[4];
    short value16[2];
    int value32;
};

conv tmp;
tmp.arr[0] = buf[0];
tmp.arr[1] = buf[1];
tmp.arr[2] = buf[2];
tmp.arr[3] = buf[3];

Or memcpy:

memcpy(&length, buf, sizeof(length));
BenjaminB
-2

atoi is more than likely the simplest fix.

int iResult = atoi(buf);
Zanven
  • `atoi()` will convert a string like "123" to the value 123. It won't convert the characters 0x01 0x02 to 1x256 + 2 = 258, which is what the question is asking. – Adam Liss Apr 13 '13 at 15:47