I know from this question that `var i = 1` and `int i = 1` are exactly the same in IL. The compiler simply replaces `var` with the actual type at compile time. But are there any instances where `var` could cause problematic behavior (maybe the compiler guessing the wrong type)?

- Your timing is impeccable for today's blog from Eric Lippert. http://blogs.msdn.com/b/ericlippert/archive/2011/04/20/uses-and-misuses-of-implicit-typing.aspx – Anthony Pegram Apr 21 '11 at 00:14
- Wow, I couldn't have even done that if I tried :) – Chris Laplante Apr 21 '11 at 00:15
- And I don't believe the compiler will *guess the wrong type.* It might *infer a type you didn't intend*, but that's not the same. Consider the perfectly legal `decimal foo = 10; decimal bar = 4; decimal baz = foo / bar;` In that code, `baz` will very clearly be 2.5. Remove the explicit typing, and now `foo` and `bar` infer to be `int`. `baz` is now also an `int` with a value of 2. – Anthony Pegram Apr 21 '11 at 00:16
- @Anthony: I think that deserves to be an answer :) – Chris Laplante Apr 21 '11 at 00:17
- The foo-bar example is true. I just ran it in VS2010. I would never have thought of such a situation myself! =) – Andrei Apr 21 '11 at 00:23
- possible duplicate of [Use of "var" type in variable declaration](http://stackoverflow.com/questions/3658407/use-of-var-type-in-variable-declaration) – LukeH Apr 21 '11 at 00:46
5 Answers
I don't believe the compiler will ever guess the wrong type. It might, however, infer a type you didn't intend, which is not the same thing.
Consider the perfectly legal
decimal foo = 10;
decimal bar = 4;
decimal baz = foo / bar;
In that code, `baz` will very clearly be 2.5. The integer literals are converted to decimals before being stored, and the math then takes place on the decimal values. Remove the explicit typing and the result is different.
var foo = 10;
var bar = 4;
var baz = foo / bar;
Now everything infers to `int`, and `baz` is 2 because the math is now taking place with integers. So, yes, code semantics could theoretically change if you introduce `var` where it was not used before. The key is to understand what type inference is really going to do with your code; if you want something to be a decimal (or any specific type X), declare it in such a way that it will be. For type inference, that would be
var foo = 10m;
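For completeness, here is a minimal sketch of the fully suffixed version (the `Console.WriteLine` call is only added here for illustration): with `m` literals everything infers to `decimal`, and the division behaves as the explicitly typed code did.
var foo = 10m;          // decimal literal, so foo infers to decimal
var bar = 4m;           // decimal as well
var baz = foo / bar;    // decimal division: baz is 2.5m

System.Console.WriteLine(baz);   // prints 2.5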

- I wonder why that happens? Why would they choose to automatically cast to int instead of keeping the precision? – Chris Laplante Apr 21 '11 at 00:26
- @Simple, because type inference happens independently on `foo`, `bar`, and `baz`. The semantics are clear: 10 is an integer literal, therefore `foo` infers to `int`. The same analysis occurs for `bar`. `baz`, being the result of an operation on two ints with clearly defined behavior, is also an `int`. (This being an extremely simplified explanation.) – Anthony Pegram Apr 21 '11 at 00:29
- For more on the actual process of type inference in C#, you may want to check out Section 7.5.2 of the language specification. http://www.microsoft.com/downloads/en/details.aspx?familyid=DFBF523C-F98C-4804-AFBD-459E846B268E&displaylang=en#QuickDetails – Anthony Pegram Apr 21 '11 at 00:34
- Although your example makes a good point, it would not come as a surprise to a programmer who wrote `var foo = 10;` that `foo` was not a `decimal`. – Rick Sladkey Apr 21 '11 at 01:01
- @Rick, sure, it might not trip up you or me, but my answer is written primarily with a maintenance programmer in mind: someone changing existing, explicitly typed code to use type inference without fully thinking it through or performing adequate testing to ensure inputs and outputs are as expected. – Anthony Pegram Apr 21 '11 at 01:10
- @Anthony: Understood. By coincidence I read your comment on Eric's "var" blog earlier today, so I know where you stand on "var"! – Rick Sladkey Apr 21 '11 at 01:20
No, I don't think so. The only time it can't figure things out is if you try to do something like
var product = null;
That makes sense, and in this case you get a compile error.
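A small sketch of the workarounds (the variable names here are just illustrative placeholders): give the compiler a type to infer from, either with explicit typing or a cast.
// var product = null;            // compile error: there is no type to infer from
string explicitProduct = null;    // explicit typing accepts a null initializer
var castProduct = (string)null;   // a cast gives var a type to infer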

It won't cause any "issues" with the code itself; however, you could have a regression issue...
interface IObject1
{
    void DoSomething();
}

interface IObject2
{
    void DoSomething();
}

var result = SomeMethod();
result.DoSomething();
Now, if `SomeMethod` returned `IObject1` and was then changed to return `IObject2`, this would still compile. However, if you expect `DoSomething` to execute particular code, this could potentially be a regression issue.
However, if you have
IObject1 result = SomeMethod();
If the return type of `SomeMethod` is changed, you'll know it right away.
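A self-contained sketch of that scenario (the `Widget` class and the method bodies are hypothetical; only `IObject1`, `IObject2`, and `SomeMethod` come from the answer):
interface IObject1 { void DoSomething(); }
interface IObject2 { void DoSomething(); }

// A type that implements both interfaces with different behavior.
class Widget : IObject1, IObject2
{
    void IObject1.DoSomething() { System.Console.WriteLine("IObject1 behavior"); }
    void IObject2.DoSomething() { System.Console.WriteLine("IObject2 behavior"); }
}

class Demo
{
    // If this signature is later changed to return IObject2, the var-based
    // caller below still compiles but silently runs the other implementation.
    static IObject1 SomeMethod() { return new Widget(); }

    static void Main()
    {
        var result = SomeMethod();   // compiles whichever interface is returned
        result.DoSomething();        // prints "IObject1 behavior" today

        // IObject1 explicitResult = SomeMethod();  // explicit typing breaks the
        //                                          // build if the return type changes
    }
}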

In addition to the ambiguous
var x = null;
the compiler will also not infer the type of overloaded method groups:
var m = String.Equals;
nor will it infer the type of lambda expressions, which could be either a `Func<>` or an `Expression<Func<>>`:
var l = (int x) => x + 1;
All that said, Anthony is right: the compiler will never do the wrong thing, though it might not do what you expect. When in doubt, hover over `var` in VS to see the static type.
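A short, self-contained sketch (variable names are arbitrary; `System.Linq.Expressions` is only needed for the expression-tree case) showing how an explicit target type resolves both situations:
using System;
using System.Linq.Expressions;

class InferenceSketch
{
    static void Main()
    {
        // Method group: the delegate type selects the String.Equals(string, string) overload.
        Func<string, string, bool> equal = String.Equals;

        // Lambda: the target type decides between a delegate and an expression tree.
        Func<int, int> addOneDelegate = x => x + 1;
        Expression<Func<int, int>> addOneExpression = x => x + 1;

        Console.WriteLine(equal("a", "a"));               // True
        Console.WriteLine(addOneDelegate(2));             // 3
        Console.WriteLine(addOneExpression.Compile()(2)); // 3
    }
}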

var x = 0;
x = 0.10; // error: Cannot convert source type 'double' to target type 'int'
An example:
double x = 0; // to initialize it
switch (something) {
    case condition1:
        x = 0.1;
        break;
    case condition2:
        x = 0.2;
        break;
}
Using `var` instead of `double` will give a compiler error.
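A compilable sketch of the fix (`condition1`, `condition2`, and `something` are placeholders standing in for the answer's pseudocode): seed the inference with a double literal and `var` behaves just like the explicit `double` declaration.
class SwitchSketch
{
    const int condition1 = 1, condition2 = 2;   // placeholder case labels

    static void Main()
    {
        int something = condition1;             // placeholder switch value

        var x = 0.0;   // 0.0 makes x infer to double; a plain 0 would make it an int
        switch (something)
        {
            case condition1:
                x = 0.1;
                break;
            case condition2:
                x = 0.2;
                break;
        }

        System.Console.WriteLine(x);            // prints 0.1
    }
}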
