
I'm quite puzzled by the following code:

typedef struct
{
    Uint16          first:8;   // has a size of 8 bits
    Uint16          second:8;  // has a size of 8 bits
    Uint16          third;     // has a size of 32 bits; what's wrong here?
} TSomeStruct;

I expected "third" to have a size of 16 bits instead of 32 bits. I'm sure the mistake must be on my part.

Background: It's not my code base and I'm performing verification on it. Since it's written for an embedded system with a proprietary compiler that I don't have, I'm just generating the syntax tree with the "-fdump-translation-unit" option and performing my verification on that. But Uint16 should be 16 bits long in GCC as well, so that shouldn't be the problem, right?
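A quick way to double-check this under gcc is to print the size directly. A minimal sketch, assuming Uint16 is typedef'ed to unsigned int (the comments below confirm this):

#include <stdio.h>

typedef unsigned int Uint16;   /* assumption: see the comments below */

typedef struct
{
    Uint16          first:8;
    Uint16          second:8;
    Uint16          third;
} TSomeStruct;

int main(void)
{
    /* With a 32-bit unsigned int this prints 8 rather than the expected 4:
       first and second share one 32-bit unit, and third occupies another. */
    printf("sizeof(TSomeStruct) = %zu\n", sizeof(TSomeStruct));
    return 0;
}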

Lichtblitz
  • What is `Uint16` typedef'ed to? And how did you determine the size? – Daniel Fischer Feb 21 '13 at 13:29
  • How did you check its size? Did you perchance do a `sizeof` of the entire structure? – cnicutar Feb 21 '13 at 13:30
  • `it's written for an embedded system` - what architecture was it written for? – Mike Feb 21 '13 at 13:41
  • What is `Uint16`? Could you use the, IMO, better `uint16_t`? – Andreas Grapentin Feb 21 '13 at 13:50
  • Oh boy, coming from C# with built-in types that have names like these, I just assumed that it was a standard type in C as well. My bad. It was defined as `typedef unsigned int Uint16;`. Is there any way to force gcc to use 16 bits instead of 32 bits for int? Otherwise the enums are getting too long even if I correct this typedef. – Lichtblitz Feb 21 '13 at 13:50
  • If, and only if, the stdint.h header has been ported for your architecture, you can use uint16_t to force 16-bit integers. Otherwise, you have to read the docs of the compiler and the architecture to find out what sizes of integers are available. – Andreas Grapentin Feb 21 '13 at 13:54
  • Btw. @DanielFischer: If you answer this question with your tip I will gladly accept it as the answer. – Lichtblitz Feb 21 '13 at 13:55

2 Answers


Structure members are usually padded to a word boundary.

There are different ways to force the compiler to pack the structure. I prefer the following:

struct __attribute__((__packed__)) packed_struct {
    unsigned char a;
    unsigned char b;
    unsigned char c;
};

In this case sizeof(struct packed_struct) will give you 3.

Unfortunately, you will pay for this with reduced program performance.
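To make the effect visible, here is a minimal sketch comparing the two layouts (mixed member widths are used deliberately, since a struct of three chars usually contains no padding to begin with):

#include <stdio.h>

/* Without packing, a typical ABI inserts 3 bytes of padding after 'a'
   so that 'b' is 4-byte aligned: sizeof is usually 8. */
struct unpacked_demo {
    unsigned char a;
    unsigned int  b;
};

/* Packed: the padding is dropped and sizeof becomes 5, at the cost of
   potentially slower, unaligned accesses to 'b'. */
struct __attribute__((__packed__)) packed_demo {
    unsigned char a;
    unsigned int  b;
};

int main(void)
{
    printf("unpacked: %zu\n", sizeof(struct unpacked_demo));
    printf("packed:   %zu\n", sizeof(struct packed_demo));
    return 0;
}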

Alex
  • Normally(1), padding is only inserted if necessary to make the next member suitably aligned. (1)In my limited experience. – Daniel Fischer Feb 21 '13 at 13:31
  • @DanielFischer This really depends on the compiler settings and the architecture compiled for. Usually compilers optimize for fast (simple) access to structure members. For architectures that don't support reading half-words, the fastest way is to place members in words. Otherwise the compiler would have to insert instructions to shift and mask the sub-word values, which is inefficient in terms of runtime. – junix Feb 21 '13 at 13:46
  • @Alex: Maybe it's worth noting how to "persuade" the compiler to increase the density of a structure: by placing `#pragma pack(1)` (see http://stackoverflow.com/questions/3318410/pragma-pack-effect for details) in front of the structure you can force the compiler to stop inserting padding; a short sketch follows this thread. Unfortunately this also introduces a performance penalty in terms of runtime. – junix Feb 21 '13 at 13:49
  • @junix, yes I know about this option, possibly I should mention it. – Alex Feb 21 '13 at 14:30
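Following up on junix's comment, a minimal sketch of the #pragma pack variant (gcc supports the push/pop form, which keeps the packing directive from leaking into later declarations):

#include <stdio.h>

#pragma pack(push, 1)   /* pack members on 1-byte boundaries */
struct pragma_packed_demo {
    unsigned char a;
    unsigned int  b;
};
#pragma pack(pop)       /* restore the previous packing */

int main(void)
{
    printf("%zu\n", sizeof(struct pragma_packed_demo));  /* typically 5 */
    return 0;
}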

Uint16 is not a standard type in C, so its size depends on what it's typedefed to.

It may be typedefed to unsigned int because that is guaranteed to be 16 bits in the proprietary compiler for the embedded platform. But then it would become a 32-bit type when compiled with a current gcc for x86 platforms.

Is there any way to force gcc to use 16 bits instead of 32 bits for int?

I know of none. If you have access to the code, using short instead of int will very likely work. If stdint.h is available, uint16_t/int16_t will work (unless there is no 16-bit integer type on the platform).
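A sketch of what the fix could look like when stdint.h is available. The _Static_assert checks require C11, and note that using uint16_t as a bit-field type is implementation-defined (gcc accepts it):

#include <stdint.h>

typedef uint16_t Uint16;   /* instead of: typedef unsigned int Uint16; */

typedef struct
{
    Uint16          first:8;
    Uint16          second:8;
    Uint16          third;
} TSomeStruct;

/* With a 16-bit Uint16, first and second share one 16-bit unit,
   so the whole struct occupies 4 bytes. */
_Static_assert(sizeof(Uint16) == 2, "Uint16 is not 16 bits");
_Static_assert(sizeof(TSomeStruct) == 4, "unexpected padding");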

Daniel Fischer
  • "`Uint16` is not a standard type in C, so its size depends on what it's typedefed to." Actually there is a high probability that it's exactly what it's name suggests. An unsigend integer of 16 bit width. It was pretty wide spread practice in embedded software to create fixed size typedefs long before C99 came up with this. – junix Feb 21 '13 at 14:34
  • @junix In this specific case, it was `typedef`ed to `unsigned int`, so it wasn't an unsigned 16-bit type when the OP used gcc. That's the whole reason for this question. – Daniel Fischer Feb 21 '13 at 14:37
  • Right. Didn't notice the comment of Lichtblitz regarding the `typedef unsigned int Uint16`. – junix Feb 21 '13 at 14:40