6

sizeof(int) shows 4 on my Dev Cpp even though it's running on a 64-bit machine. Why doesn't it consider the underlying hardware and show 8 instead? Also, if my compiling environment also changed to 64-bit (does a 64-bit compiler make sense in the first place?!), would the size of int change then?

Are there any standards which decide this?
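A minimal sketch of the check I mean (the cast to unsigned long avoids depending on %zu support in the older MinGW runtime that Dev-C++ typically ships with):

```c
#include <stdio.h>

int main(void)
{
    /* prints 4 here, even though the machine is 64-bit */
    printf("sizeof(int) = %lu\n", (unsigned long)sizeof(int));
    return 0;
}
```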

Pavan Manjunath

2 Answers

18

Taken from http://en.wikipedia.org/wiki/64-bit (under 64-bit data models)

There are various models: Microsoft decided that sizeof(int) == 4, while some (a few) others didn't.

HAL Computer Systems' port of Solaris to SPARC64 and Unicos seem to be the only ones where sizeof(int) == 8; they are called the ILP64 and SILP64 models.

The real "war" was over sizeof(long): Microsoft settled on sizeof(long) == 4 (LLP64) while nearly everyone else settled on sizeof(long) == 8 (LP64).

Note that in truth it's the compiler that "decides" which model to use, but as the wiki itself puts it:

Note that a programming model is a choice made on a per-compiler basis, and several can coexist on the same OS. However, the programming model chosen as the primary model for the OS API typically dominates.
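If you want to see which model your own compiler uses, a rough sketch like the following prints the relevant sizes and guesses the model (the labels are just the usual names, nothing mandated by a standard):

```c
/* Rough sketch: classify the data model of the current compilation target
   from the sizes it assigns to int, long and pointers. */
#include <stdio.h>

int main(void)
{
    unsigned i = (unsigned)sizeof(int);
    unsigned l = (unsigned)sizeof(long);
    unsigned p = (unsigned)sizeof(void *);

    printf("int=%u long=%u pointer=%u -> ", i, l, p);

    if (p == 4)
        printf("32-bit target (e.g. ILP32)\n");
    else if (i == 4 && l == 4)
        printf("LLP64 (e.g. 64-bit Windows)\n");
    else if (i == 4 && l == 8)
        printf("LP64 (most 64-bit Unix-like systems)\n");
    else if (i == 8)
        printf("ILP64/SILP64 (rare)\n");
    else
        printf("some other model\n");

    return 0;
}
```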

xanatos
  • And the reason the OS API dominates is that if you want to call into a dll or similar (including system calls), then caller and callee need to agree on the size of the parameters. A C implementation that can make system calls is usually more useful than one that can't, so people use the OS's ABI unless there's a reason not to. The fact that 64-bit Windowses can run 32-bit executables is thanks to them (among other things) providing 32-bit versions of everything the program needs to call. – Steve Jessop Mar 13 '12 at 18:04
  • Making `int` 64 bits, and `char` 8 bits, would mean that you couldn't have both a 16-bit and a 32-bit predefined integer type (unless you resort to *extended integer types*, but most compiler implementers haven't done that). – Keith Thompson Aug 22 '13 at 20:32
0

While the compiler ultimately decides the size of an integer, the choice is usually influenced by the size of the CPU registers that would hold it. Many processors support both 32-bit and 64-bit register arithmetic, and the compiler settings determine which mode is used. As for sizeof(long) and friends, the standard only guarantees minimum ranges (and hence minimum widths): char is at least 8 bits, short and int at least 16 bits, long at least 32, long long at least 64, and each type can represent at least the range of the narrower ones.
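A sketch of those minimum-range guarantees using <limits.h> (requires C11 for _Static_assert); these asserts hold on every conforming implementation, whatever sizes the compiler picks:

```c
#include <limits.h>

/* char is at least 8 bits, and sizeof(char) is 1 by definition */
_Static_assert(CHAR_BIT >= 8, "char has at least 8 bits");

/* minimum magnitudes required by the standard */
_Static_assert(SHRT_MAX  >= 32767,                 "short covers at least 16 bits");
_Static_assert(INT_MAX   >= 32767,                 "int covers at least 16 bits");
_Static_assert(LONG_MAX  >= 2147483647L,           "long covers at least 32 bits");
_Static_assert(LLONG_MAX >= 9223372036854775807LL, "long long covers at least 64 bits");

int main(void) { return 0; }
```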

MadHacker