
I was just wondering: for the purposes of what I'm doing, the details aren't important, but a char is too short and an int is too long. I don't need 32 bits; 24 bits sounds perfect. Why don't languages have a 3-byte primitive type? There are 1-byte, 2-byte, and 4-byte types, so why skip 3 bytes? Is it some hardware issue where 3 bytes causes problems, or is it just a convention that people stuck with?

  • It's not a convention... it's more complicated than that. The duplicate question has answers that might help you. – Maroun Apr 24 '15 at 08:32
  • Why don't you use `struct { char low; char mid; char high; };`? [see the first sketch after these comments] – senfen Apr 24 '15 at 08:39
  • Just because usual hardware doesn't support that (without converting it to 4 bytes first) [see the second sketch below] – deviantfan Apr 24 '15 at 08:43
  • There are certainly 24-bit microprocessors (e.g. DSPs) around, where a 24-bit word is a usual int type :-) – Valentin H Apr 24 '15 at 08:45
  • Yes, there is no other way. If it is not possible, we will get 4 bytes, but if it is possible (if we are on a strange architecture) we will get 3 bytes. There is no better solution. And in code he will be able to treat this structure as 3 bytes; sizeof will give 3 bytes. – senfen Apr 24 '15 at 08:46
  • The main difference between 3 and 1, 2, or 4 is that 3 is not a power of 2, so it plays a little less well in terms of alignment, addressing, etc. – Drax Apr 24 '15 at 08:46
  • In C, why not? http://stackoverflow.com/q/17834838/995714 There are even "stranger" systems with 18-, 27-, 36- or 60-bit words: http://stackoverflow.com/q/6971886/995714 – phuclv Apr 24 '15 at 10:45
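
A minimal sketch of the struct approach senfen suggests, assuming a typical compiler where a struct of three `char`s has alignment 1 and therefore `sizeof` 3; the names `uint24`, `pack24`, and `unpack24` are made up for illustration:

```c
#include <stdio.h>
#include <stdint.h>

/* Three bytes laid out as the low, middle, and high bytes of a
   24-bit value, as suggested in the comments. */
struct uint24 {
    unsigned char low;
    unsigned char mid;
    unsigned char high;
};

/* Pack the low 24 bits of a 32-bit value into the struct. */
static struct uint24 pack24(uint32_t v) {
    struct uint24 u;
    u.low  = (unsigned char)(v & 0xFF);
    u.mid  = (unsigned char)((v >> 8) & 0xFF);
    u.high = (unsigned char)((v >> 16) & 0xFF);
    return u;
}

/* Widen back to 32 bits whenever arithmetic is needed. */
static uint32_t unpack24(struct uint24 u) {
    return (uint32_t)u.low | ((uint32_t)u.mid << 8) | ((uint32_t)u.high << 16);
}

int main(void) {
    struct uint24 u = pack24(0xABCDEF);
    printf("sizeof(struct uint24) = %zu\n", sizeof(struct uint24)); /* typically 3 */
    printf("value = 0x%06X\n", (unsigned)unpack24(u));              /* 0xABCDEF */
    return 0;
}
```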
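
Alternatively, and touching on deviantfan's and Drax's points about hardware and alignment, a C bit-field gives you 24-bit arithmetic with the compiler doing the masking, but the containing struct is typically padded out to 4 bytes anyway. A sketch, assuming a common ABI:

```c
#include <stdio.h>

/* A 24-bit bit-field: reads and writes are truncated to 24 bits,
   but the struct usually still occupies a full 4-byte word. */
struct bf24 {
    unsigned int value : 24;
};

int main(void) {
    struct bf24 b = { .value = 0xFFFFFF };
    b.value += 1;  /* wraps around to 0 at 24 bits */
    printf("sizeof(struct bf24) = %zu\n", sizeof(struct bf24)); /* typically 4 */
    printf("after wrap: 0x%06X\n", (unsigned)b.value);          /* 0x000000 */
    return 0;
}
```

In other words, you can squeeze the storage down to 3 bytes (first sketch) or the arithmetic down to 24 bits (second sketch), but on common hardware the value still travels through 32-bit registers, which is essentially the point the comments above are making.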

0 Answers