I was checking the sizes of short, int and long in code. My expectation was that the sizes of these types would all differ. Surprisingly, the sizes of long and int are the same: 4 bytes. I have provided the code and output. I am using a 64-bit Windows OS and the GCC compiler (downloaded from https://www.msys2.org). I have tried to search for the reason behind it, but unfortunately I either could not understand it or the reason was a little vague.

#include <stdio.h>

int main()
{
 printf("Size of short is %d",sizeof(short));
 printf("\nSize of int  is %d",sizeof(int));
 printf("\nSize of long  is %d",sizeof(long));
}

Output:

Size of short is 2
Size of int  is 4
Size of long  is 4

  • Type sizes depend on the platform. I get `sizeof(long) == 8` on my Linux (which is the same size as `long long`). – HolyBlackCat Oct 16 '21 at 09:12
  • @HolyBlackCat, thanks for the reply. Why is it showing 4 bytes on my system? I want to know the proper and very specific reason behind it. – novice programmer Oct 16 '21 at 09:16
  • "My expectation was size of all data type modifiers will differ". It is totally unfounded. – n. m. could be an AI Oct 16 '21 at 09:18
  • The proper and very specific reason is "it just happened that way historically". – n. m. could be an AI Oct 16 '21 at 09:19
  • @n.1.8e9-where's-my-sharem. The sizes of `int` and `long` are equal – novice programmer Oct 16 '21 at 09:20
  • *The proper and very specific reason is "it just happened that way historically".* I didn't get this. – novice programmer Oct 16 '21 at 09:23
  • BTW if you want a 64-bit integer then `long long` is guaranteed to be >= 64 bits in size. – mediocrevegetable1 Oct 16 '21 at 09:24
  • `%zu` is the correct `printf` format specifier for values of type `size_t`, not `%d`. – Paul Hankin Oct 16 '21 at 09:24
  • Nowadays it's generally better to `#include <stdint.h>` and use types from there (e.g., `int32_t`, `uint64_t`, `int_least32_t`, `int_fast16_t`, etc.). – Arkku Oct 16 '21 at 09:26
  • C is a language of compromises. `int` is the work-horse/"default" integer size. `short` is for little numbers, `long` for larger ones. It is all fuzzy to accommodate the variety of machines and coding goals over the past 50 years - and the next 50. If you want specific widths, research the `intN_t` types. – chux - Reinstate Monica Oct 16 '21 at 09:35
  • @mediocrevegetable1, thanks for the reply. I have gone through the link. In that thread the person is using VC++, and the accepted answer is that MS chose to make `long` 32 bits for some historical reason. But here I am using GNU GCC, so I don't think that accepted answer applies. – novice programmer Oct 16 '21 at 09:38
  • @mediocrevegetable1 "BTW if you want a 64-bit integer then long long is guaranteed to be >= 64 bits in size" why long is not 64 bit itself, It looks quite redundant to use long two time for getting 64 bit memory whereas single long not serving the purpose. – novice programmer Oct 16 '21 at 09:44
  • @noviceprogrammer because they are different types with different requirements. `int` must be >= 16 bits, `long` must be >= 32 and `long long` must be >= 64. Also, the OP in [this question](https://stackoverflow.com/questions/22344388/size-of-long-int-and-int-in-c-showing-4-bytes?lq=1) is using GCC on a 64-bit Windows system and they too are getting `sizeof(long) == 4`. – mediocrevegetable1 Oct 16 '21 at 09:50
  • You aren't engaged in [tag:compiler-construction] here. Don't tag indiscriminately. – user207421 Oct 16 '21 at 09:54
  • @mediocrevegetable1, I might look absurd, as I am not able to understand the answer. – novice programmer Oct 16 '21 at 09:58
  • @mediocrevegetable1, I already mentioned that the link has not addressed my query, but you still closed it. – novice programmer Oct 16 '21 at 10:03
  • This is about C++ but it applies to C as well and is a better dupe: https://stackoverflow.com/questions/589575/what-does-the-c-standard-state-the-size-of-int-long-type-to-be – bolov Oct 16 '21 at 10:08
  • @noviceprogrammer the comment was generated automatically when I voted to close the question, sorry for the misunderstanding. However, I do still stand by my close vote; I'm quite sure this is related to Windows itself, not any specific compiler. Plus, the question also has the standard answer that "`long` can be 32 bits because the standard allows it", which may not be the answer you want but it is what it is :/ – mediocrevegetable1 Oct 16 '21 at 10:10
  • The size of each type is determined by the *minimum range of values* they must be able to represent. An `int` must be able to represent *at least* the range `-32767..32767`, meaning it must be *at least* 16 bits wide. However, it’s common practice for `int` to also be the same size as the native word size. Once 32-bit machines became common, it was also common for `int` to be 32 bits wide, but it didn’t happen all at once. I worked in code that had to run on classic MacOS and Windows 3.1, and MPW used 32-bit `int` while VS used 16-bit `int`. That was a fun afternoon. – John Bode Oct 16 '21 at 12:59
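
Putting the advice from these comments together, here is a minimal sketch of the more portable approach: `long long` for a guaranteed-at-least-64-bit type, the exact-width `<stdint.h>` types where a specific width matters, and `%zu` for `sizeof` results. It assumes a C99-or-later toolchain with a conforming `printf` (recent MSYS2 GCC qualifies); the variable names are illustrative only.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    long long big = 9000000000LL; /* needs more than 32 bits; long long is >= 64 bits */
    int32_t exactly32 = 0;        /* exactly 32 bits on any platform that provides it */
    int64_t exactly64 = 0;        /* exactly 64 bits, regardless of sizeof(long) */

    /* sizeof yields a size_t, so the matching specifier is %zu, not %d */
    printf("sizeof(long long) = %zu\n", sizeof big);
    printf("sizeof(int32_t)   = %zu\n", sizeof exactly32);
    printf("sizeof(int64_t)   = %zu\n", sizeof exactly64);
    printf("big = %lld\n", big);
    return 0;
}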

1 Answer


The sizes of the integer types depend entirely on the system or machine you're working on. For example, on my computer `sizeof(int) == 4`, `sizeof(short) == 2` and `sizeof(long) == 8`. But that is not always the case; the only guarantee is:

`sizeof(short) <= sizeof(int) <= sizeof(long)`

Thus, it is possible for some of your integer types to have the same size: maybe short and int, or int and long, are equal to each other, or none of them are the same. It totally depends on the system you're working on.
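
As a concrete check, here is a minimal sketch that both prints the sizes with the correct `%zu` specifier and encodes the ordering guarantee as compile-time assertions (assuming a C11 compiler for `_Static_assert`; on a conforming implementation the assertions can never fire, they simply document the rule in code):

#include <stdio.h>

/* The C standard requires this ordering, so a conforming
   compiler will always accept these assertions. */
_Static_assert(sizeof(short) <= sizeof(int), "short wider than int");
_Static_assert(sizeof(int) <= sizeof(long), "int wider than long");
_Static_assert(sizeof(long) <= sizeof(long long), "long wider than long long");

int main(void)
{
    printf("short     : %zu bytes\n", sizeof(short));
    printf("int       : %zu bytes\n", sizeof(int));
    printf("long      : %zu bytes\n", sizeof(long));
    printf("long long : %zu bytes\n", sizeof(long long));
    return 0;
}

On the asker's 64-bit Windows/GCC setup this should print 2, 4, 4 and 8; on a typical 64-bit Linux it prints 2, 4, 8 and 8. Both satisfy the guarantee above.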

AbuKiks
  • How does the system determine 4 bytes for `long` and not some other size? – novice programmer Oct 16 '21 at 10:00
  • @noviceprogrammer: The C standard specifies that a `char` (signed) must be able to represent *at least* the range `-127..127`, so it must be *at least* 8 bits wide. `short` and `int` must be able to represent *at least* the range `-32767..32767`, meaning they must be *at least* 16 bits wide. `long` must be able to represent the range `-2147483647..2147483647`, so it must be *at least* 32 bits wide. These are *minimum* ranges, and an implementation may define types to store wider ranges. – John Bode Oct 16 '21 at 13:12
  • @noviceprogrammer: C is a product of the early 1970s when word sizes were not uniform across systems, so it didn’t mandate specific sizes, only what types had to be able to represent. – John Bode Oct 16 '21 at 13:13
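
To see the concrete ranges an implementation actually chose (as opposed to the standard's minimums John Bode describes), `<limits.h>` can be queried directly; a minimal sketch:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* These macros report this implementation's chosen ranges;
       the standard only mandates the minimum magnitudes. */
    printf("int  : %d .. %d\n", INT_MIN, INT_MAX);
    printf("long : %ld .. %ld\n", LONG_MIN, LONG_MAX);
    return 0;
}

On an LLP64 system such as 64-bit Windows, LONG_MAX is 2147483647; on an LP64 system such as 64-bit Linux it is 9223372036854775807.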