
Possible Duplicate:
size of int, long, etc
Does the size of an int depend on the compiler and/or processor?

I'm not sure if similar questions have been asked before on SO (at least, I couldn't find any while searching, so I thought of asking myself).

What determines the size of int (and other data types) in C? I've read that it depends on the machine/operating system/compiler, but I haven't come across a clear or detailed enough explanation of things like what overrides what. Any explanation or pointers will be really helpful.

Raj
  • The C standard gives a minimum range of values for each type. The compiler is ultimately responsible for this. – andre Dec 07 '12 at 14:25
  • http://stackoverflow.com/questions/2331751/does-the-size-of-an-int-depend-on-the-compiler-and-or-processor is probably a better dupe. – Stephen Canon Dec 07 '12 at 14:37
  • [Does the size of an int depend on the compiler and/or processor?](https://stackoverflow.com/q/2331751/608639), [What does the C++ standard state the size of int, long type to be?](https://stackoverflow.com/q/589575/608639) – jww Feb 22 '18 at 08:52

4 Answers


Ultimately the compiler does, but in order for compiled code to play nicely with system libraries, most compilers match the behavior of the compiler[s] used to build the target system.

So loosely speaking, the size of int is a property of the target hardware and OS (two different OSs on the same hardware may have a different size of int, and the same OS running on two different machines may have a different size of int; there are reasonably common examples of both).

All of this is also constrained by the rules in the C standard. int must be large enough to represent all values between -32767 and 32767, for example.

Stephen Canon
  • Suppose the CPU register is `16-bit`. Then would it be good to keep `int` at 16 bits, or can it be a `multiple of 16`? I mean, in most earlier compilers `int` was 16-bit. – Grijesh Chauhan Dec 07 '12 at 14:42
  • @GrijeshChauhan: the answer to that really depends on the goals of the system and the software that is intended to run on it. – Stephen Canon Dec 07 '12 at 14:46
  • Small correction. It's `-32768`. – P.P Dec 07 '12 at 15:12
  • @KingsIndian: no, it isn't; the C standard is quite clear on that point. If a 16-bit 2s complement representation happens to be used, it will be able to represent `-32768`, but that isn't required. – Stephen Canon Dec 07 '12 at 15:24
  • @StephenCanon Just checked the standard. It's indeed `-32767` for SHRT_MIN and INT_MIN. Thanks for the clarification. – P.P Dec 07 '12 at 15:31

int is the "natural" size for the platform, and in practice that means one of

  • the processor's register size, or

  • a size that's backward compatible with existing code-base (e.g. 32-bit int in Win64).

A compiler vendor is free to choose any size with ≥ 16 value bits, except that (for desktop platforms and higher) a size that doesn't work with the OS's API will mean that few if any copies of the compiler are sold. ;-)

Cheers and hth. - Alf

The size of C data types is constrained by the C standard, typically as constraints on the minimum size. The host environment (target machine + OS) may impose further restrictions, typically constraints on the maximum size. And finally, the compiler is free to choose suitable values between these minimum and maximum values.

Generally, it's considered bad practice to make assumptions about the size of C data types. Besides, it's not necessary, since C will tell you:

  • the sizeof operator tells you an object's size in bytes
  • the macro CHAR_BIT from limits.h tells you the number of bits per byte

Hence, sizeof(foo) * CHAR_BIT tells you the size of type foo, in bits, including padding.

Anything else is just assumptions. Note that the host environment may as well consist of 10,000 Chinese guys with pocket calculators and a huge blackboard, pulling size constraints out of thin air.

Philip

SO does not know everything, but Wikipedia almost does...
See Integer_(computer_science)

Note (b) says:
"The sizes of short, int, and long in C/C++ are dependent upon the implementation of the language; dependent on data model, even short can be anything from 16-bit to 64-bit. For some common platforms:
  • On older, 16-bit operating systems, int was 16-bit and long was 32-bit.
  • On 32-bit Linux, DOS, and Windows, int and long are 32-bits, while long long is 64-bits. This is also true for 64-bit processors running 32-bit programs.
  • On 64-bit Linux, int is 32-bits, while long and long long are 64-bits."

Francois
  • (a) just posting a link isn't an answer (http://meta.stackexchange.com/questions/8231/are-answers-that-just-contain-links-elsewhere-really-good-answers); at best it's a comment. (b) that link doesn't actually answer the question that was asked, and the information in that table is borderline misleading to boot. – Stephen Canon Dec 07 '12 at 14:33