
The number of bits in an integer in C is compiler and machine dependent. What is meant by this? Does the number of bits in an `int` vary with different C compilers and different processor architectures? Can you illustrate what this means?

Daniel Daranas
user319280
    Exactly, it does. –  Oct 08 '13 at 08:23
  • It depends on the processor architecture: a 32-bit CPU typically has 4-byte `int`s, a 64-bit CPU may have 8-byte integer types, etc. – Ishmeet Oct 08 '13 at 08:24
    @OP It refers to **the size** of the integer. For example in x86 `sizeof(long)` is 4 while on amd64 `sizeof(long)` is typically 8. – cnicutar Oct 08 '13 at 08:24
  • This is a (duplicate) question about the rules of a language, C, about one of its types, `int`; not about people making computers with more capacity so that we the users can store larger stuff in them. See [this answer](http://stackoverflow.com/a/11438985/96780) from the duplicate question. – Daniel Daranas Oct 08 '13 at 08:29
  • Please answer your homework or test questions yourself. – Jim Balter Oct 08 '13 at 08:38
  • See also the related question [gcc - Size of integer in C](http://stackoverflow.com/q/7180196/96780) and its [accepted answer](http://stackoverflow.com/a/7180229/96780). – Daniel Daranas Oct 08 '13 at 08:42
  • See also the related question [Does the size of an int depend on the compiler and/or processor?](http://stackoverflow.com/q/2331751/96780) and its [accepted answer](http://stackoverflow.com/a/2331835/96780). – Daniel Daranas Oct 08 '13 at 08:43

2 Answers


This Wikipedia article gives a good overview: http://en.wikipedia.org/wiki/Word_(data_type)

Types such as integers are represented in hardware. Hardware changes, and so do the sizes of certain types. The more bits in a type, the larger the numbers it can represent (for integers) or the more precision it can store (for floating-point types).

There are some types which specify the number of bits exactly, such as `int16_t` from `<stdint.h>`.

Joe
  • Down-voting without an explanation isn't very constructive. – Joe Oct 08 '13 at 08:27
  • The place where Wikipedia explains the sizes of C's types is http://en.wikipedia.org/wiki/C_data_types#Basic_types. – Daniel Daranas Oct 08 '13 at 08:31
  • The questioner was asking for an explanation of the fact that sizes can differ, not necessarily a list. I think my link was more useful in connection with the question. – Joe Oct 08 '13 at 08:34
  • I think the specification of types in a language is better understood by reading the rules of the language itself. But of course the OP will benefit from reading both links. – Daniel Daranas Oct 08 '13 at 08:36

It means exactly what it says and what you said in your own words.

For example, on some compilers and platforms an `int` is 32 bits; on other compilers and platforms an `int` is 64 bits.

I remember long ago, when I was programming on the Commodore Amiga, that two different C compilers were available from two different manufacturers. On one compiler an `int` was 16 bits; on the other it was 32 bits.

You can use `sizeof` to determine how many bytes an `int` occupies with your compiler.

Jesper