
When I was reading about implicitly typed variables, this question came to my mind. I couldn't find the answer on the internet, so I decided to post it here.

Suppose I declare a variable using the `var` keyword:

var i = 10;

After compilation, `i` is compiled/treated as an `int`.

Now, my question is: why is `i` not compiled to `short`, given that the value of `i` is easily small enough to fit into the `short` data type? Why is it always compiled to `int`?
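To make the question concrete, here is a minimal sketch (using C# top-level statements) that prints the type the compiler actually infers for the `var` declaration:

```csharp
using System;

// The compiler infers the type of 'i' from the literal 10.
var i = 10;

// Inspect the runtime type of the inferred variable.
Console.WriteLine(i.GetType()); // prints "System.Int32", not "System.Int16"
```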

Iorn Man

1 Answer


Because the C# Specification (section 2.4.4.2) dictates that if a numeric literal has no decimal point or suffix, its type is the first of `int`, `uint`, `long`, or `ulong` that can represent its value. Since 10 fits in an `Int32`, that is the type chosen.

The type of an integer literal is determined as follows:

  • If the literal has no suffix, it has the first of these types in which its value can be represented: int, uint, long, ulong.
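A small sketch of that rule in action (the values below are illustrative; digit separators require C# 7 or later):

```csharp
using System;

// Each literal gets the FIRST type in the list int, uint, long, ulong
// that can represent its value:
var a = 10;                         // int:   fits in Int32
var b = 3_000_000_000;              // uint:  exceeds int.MaxValue, fits in uint
var c = 5_000_000_000;              // long:  exceeds uint.MaxValue
var d = 10_000_000_000_000_000_000; // ulong: exceeds long.MaxValue

Console.WriteLine(a.GetType()); // System.Int32
Console.WriteLine(b.GetType()); // System.UInt32
Console.WriteLine(c.GetType()); // System.Int64
Console.WriteLine(d.GetType()); // System.UInt64
```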

The C# language designers decided that even though 10 would fit in an `Int16`, use of that type would be relatively rare, and that `Int32` should be the status quo.

Actually, if memory serves, there is no `Int16` literal in C# anyway: there is no suffix that produces a `short`. You'd have to explicitly declare a variable as `Int16` and assign the value to it.
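A brief sketch of the ways you can (and cannot) get a `short` from a literal:

```csharp
using System;

short s1 = 10;      // OK: an implicit constant conversion narrows the int literal 10
var s2 = (short)10; // the only way to make 'var' infer short: cast the literal
// var s3 = 10S;    // would not compile: C# has no short literal suffix

Console.WriteLine(s1.GetType()); // System.Int16
Console.WriteLine(s2.GetType()); // System.Int16
```

Note that `short s1 = 10;` works only because the literal is a compile-time constant in range; with `var`, no target type exists to trigger that conversion, so the literal keeps its natural type, `int`.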


Interestingly, I looked at the compiled IL code, and declaring `short s = 10` and `int i = 10` actually generate the exact same load instruction... so now I'm wondering how shorts are managed. As it happens, the CLI evaluation stack only tracks values as `int32`, `int64`, native int, or floating point, so `short` values are widened to 32 bits whenever they are loaded onto the stack; the 16-bit size only matters for storage (locals, fields, arrays).
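You can see this 32-bit treatment leak into the language itself: arithmetic on `short` operands is defined on `int`, so the result of adding two shorts is an `int`. A minimal sketch:

```csharp
using System;

short a = 1, b = 2;

// There is no short + short operator; both operands are promoted to int,
// so the inferred type of the sum is int.
var sum = a + b;
Console.WriteLine(sum.GetType()); // System.Int32

// short sum2 = a + b; // would not compile without an explicit (short) cast
```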

Chris Sinclair
  • 22,858
  • 3
  • 52
  • 93