-1
#include <stdio.h>

int main(void)
{
    int a;
    a = 100000000;
    printf("%d\n", a);
    return 0;
}

// Why does the value of a get printed correctly even if it is larger than the range of int?

Veve
Satyam Garg

4 Answers

3

"because range of int type in c is from -32767 to 32767"

You have false assumptions there.

The range of type int is not fixed, and an int is not necessarily 16 bits.

The C standard doesn't define a fixed range for int. It only requires that a conforming implementation support at least the range -32767 to 32767 for int. An implementation is free to support a wider range, and on most modern systems an int is 32 bits wide.

So there's nothing unexpected in your output. If you want to know the exact limits, use the macros INT_MIN and INT_MAX from <limits.h>.

Relevant C-FAQ: How should I decide which integer type to use?

P.P
2

That number fits in a 32-bit integer, which can hold values up to 2,147,483,647 (signed).

Cecilio Pardo
1

The size of int can vary across CPU architectures. If you want to be absolutely sure how wide your integer is, use the fixed-width types from <stdint.h>:

#include <stdint.h>

int32_t a;  /* exactly 32-bit signed integer */
uint16_t b; /* exactly 16-bit unsigned integer */

instead.

0

In C the size of an integer can be 2, 4, or even 8 bytes, depending on the compiler you are using and on your system.

  1. On a 16-bit processor (ancient by now), int will be 2 bytes.
  2. On a 32-bit processor it will usually be 4 bytes.
  3. On a 64-bit processor it can even be 8 bytes.

The above held to some extent for older compilers.

But this is not always the case. Cross compilation is required (among other things) when the host CPU's int differs from the int used on the target CPU, and this can cause problems when porting code across platforms. Most modern C compilers allow the size of integers to be altered by the user.

Isuru