
A short is at least 16 bits and a long is at least 32 bits, so what's the point of an int which can be either 16-bit or 32-bit?

PS: I'm talking about ANSI C here.

Bite Bytes
  • `int` is (in most cases; this isn't possible on 8-bit architectures) the *natural word size* of the machine, and therefore often the most performant. On IA-32, 32-bit integer arithmetic can indeed perform better than 16-bit. –  May 27 '17 at 14:57
  • @FelixPalmen, thanks, so it's preferable to use `int` even if you only need, say, an 8-bit variable (in arithmetic calculations, for example). – Bite Bytes May 27 '17 at 15:00
  • 2
    it depends. Maybe size is your bigger concern? But, as a rule of thumb, and apart from size-constrained systems: Use `int` whenever you need an integer of at least 16bit, but don't care about the actual bigger size. –  May 27 '17 at 15:02
  • @BiteBytes values smaller than `int` are promoted to `int` for arithmetic calculations anyway (so that intermediate steps do not so easily overflow), so you may be giving the machine *more* work to do, not less. – Weather Vane May 27 '17 at 15:07
  • @WeatherVane is this a part of the standard? – Bite Bytes May 27 '17 at 15:11
  • 3
    Please see [section 6.3 here](http://www.open-std.org/jtc1/sc22/wg14/www/docs/n1256.pdf) and footnote 48. – Weather Vane May 27 '17 at 15:19
  • ...because it's like porridge. If it's too hot, or too cold, it's not so useful. – ThingyWotsit May 27 '17 at 15:46
  • 1
    by the way, the standard doesn't say `int` can only be 16 or 32 bits, only that `int` is at least 16 bits (and not larger than `long`) – UnholySheep May 27 '17 at 15:58
  • In the past the only type was `int`; there wasn't even `char` or `void`. – phuclv May 27 '17 at 16:10
  • That's why [`memset` takes an `int` instead of a `char`](https://stackoverflow.com/q/5919735/995714). [The same goes for `strchr`](https://stackoverflow.com/q/2394011/995714). – phuclv May 27 '17 at 16:16
  • Why the down vote? – Bite Bytes May 27 '17 at 16:26
  • @UnholySheep from K&R second edition I quote: **_"int will normally be the natural size for a particular machine. short is often 16 bits, long 32 bits, and int either 16 or 32 bits."_** – Bite Bytes May 27 '17 at 16:44
  • 1
    1. *"Normally"* does not mean that no other sizes are allowed, it's just that these two are the most common. 2. K&R is _not_ the standard – UnholySheep May 27 '17 at 16:54
  • @BiteBytes there are lots of architectures with [18-bit int, 24-bit int](https://stackoverflow.com/q/6971886/995714), [36-bit int](https://stackoverflow.com/q/2098149/995714) or even [60-bit int](https://en.wikipedia.org/wiki/Word_%28computer_architecture%29) – phuclv May 29 '17 at 03:04

1 Answer


`short`, `int` and `long` are by definition three distinct type specifiers, where `short int` ranks lower than `int`, which ranks lower than `long int`.

The C standard specifies only minimum and relative requirements (for example, `int` cannot be wider than `long`); an implementation may choose any wider definition of these types, as long as those constraints are preserved.

Sourav Ghosh