In the following code, why does the compiler evaluate the result variable as 3 (an integer) rather than as 3.5 (a float or double)?
void Main()
{
    var result = 7 / 2;
    Console.WriteLine(result);
}
Thanks
Because, if you divide one int by another, the result is an int. That behavior is specified in the C# language specification.
When you divide two integers, the result is always an integer. For example, the result of 7 / 3 is 2.
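For instance, here's a quick sketch in the same Main style as the question, showing that the fractional part is simply discarded; C# truncates integer division toward zero:

void Main()
{
    Console.WriteLine(7 / 3);  // 2  (2.333... truncated toward zero)
    Console.WriteLine(7 / 2);  // 3  (3.5 truncated toward zero)
    Console.WriteLine(-7 / 2); // -3 (truncated toward zero, not floored to -4)
}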
Then var just uses the expression type to create an int, because that's what you've told it you want (albeit implicitly). If you want a double, you need to force the type of the expression to a double:
var result = 7.0 / 2;
or, if you're using int variables where you can't just tack on a .0:
int seven = 7;
int two = 2;
var result = (double)seven / two;
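One pitfall worth pointing out (my own note, not from the answer above): the cast has to land on an operand, not on the finished result, because casting after the division has run is too late:

int seven = 7;
int two = 2;
var tooLate = (double)(seven / two); // 3: the integer division already happened inside the parentheses
var correct = (double)seven / two;   // 3.5: one operand is a double, so double division is performed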
Because you are doing integer division. You need to convert one of the values to float or double to get the expected result.
Try this:
void Main()
{
    var result = 7 / 2.0; // making either operand a double forces double division
    Console.WriteLine(result);
}
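As a side note, C#'s numeric literal suffixes give you several equivalent ways to make an operand non-integer; a small sketch (d = double, f = float, m = decimal):

void Main()
{
    Console.WriteLine(7 / 2.0);       // 3.5 (double, via a double literal)
    Console.WriteLine(7d / 2);        // 3.5 (double, via the d suffix)
    Console.WriteLine(7f / 2);        // 3.5 (float, via the f suffix)
    Console.WriteLine(7m / 2);        // 3.5 (decimal, via the m suffix)
    Console.WriteLine((double)7 / 2); // 3.5 (double, via an explicit cast)
}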
Operators like / also behave like functions. The built-in signature for int / int is

public static int operator /(int numerator, int denominator)
{
    // performs integer division, truncating toward zero
}

So the return value is an int.
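To make the "operators are functions" point concrete, here's a hypothetical Fraction struct (purely illustrative, not a real library type) whose / operator is declared to return double; the compiler picks an operator overload from the operand types, and the result type follows from that signature:

struct Fraction
{
    public int Value;
    public Fraction(int value) { Value = value; }

    // Chosen when both operands are Fractions; its declared
    // return type makes the whole expression a double.
    public static double operator /(Fraction a, Fraction b)
        => (double)a.Value / b.Value;
}

// Usage:
// var r = new Fraction(7) / new Fraction(2); // r is a double: 3.5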
As others have pointed out, you're getting 3 rather than 3.5 because you're performing integer division. What most answers have implied but not stated explicitly is why it's integer division. It's integer division because both operands are integers. They are both integers because 7 and 2 are integer literals. If you added a decimal point to either literal, it would become a double literal, and double division would be performed rather than integer division. Double division is performed whenever at least one operand is a double, whether you're dividing a double by an int, an int by a double, or a double by a double.
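A short sketch of that rule in the question's Main style; printing the runtime type of each quotient shows which division the compiler chose:

void Main()
{
    Console.WriteLine((7 / 2).GetType());     // System.Int32:  integer division
    Console.WriteLine((7.0 / 2).GetType());   // System.Double: double division
    Console.WriteLine((7 / 2.0).GetType());   // System.Double: double division
    Console.WriteLine((7.0 / 2.0).GetType()); // System.Double: double division
}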