0

I know this is a really basic question, but please explain why this is.

var number = 7 / 2; // number is 3
decimal number = 7 / 2; // number is still 3

Why isn't the number 3.5? Why is it always cast to int? Is it because both numbers are int? This works when I cast one of the numbers to decimal.
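
For example, this is what I mean by casting one of the numbers (a minimal sketch):

decimal number = (decimal)7 / 2; // number is 3.5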

TuomasK
  • You are dividing two integers, so the result is always an integer. – Steve Aug 30 '16 at 12:45
  • The denominator is 2 (change it to 2.0), so the compiler truncates the fractional part of the result. – jdweng Aug 30 '16 at 12:45
  • The expression *7 / 2* (type `int` / `int`) will evaluate to the integer value 3 because it uses integer division. – Raktim Biswas Aug 30 '16 at 12:50
  • If you want a numeric literal to be treated as decimal, use the suffix `m` after one of the numbers you're dividing (as Rene and Manfred have pointed out). – Grizzly Aug 30 '16 at 12:51
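
A quick sketch of the two fixes the comments mention (a 2.0 literal makes the division double division, the m suffix makes it decimal division); the variable names are just for illustration:

var asDouble = 7 / 2.0;  // one operand is double, so the result is the double 3.5
var asDecimal = 7 / 2m;  // one operand is decimal, so the result is the decimal 3.5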

2 Answers

7

The literals 7 and 2 are both interpreted by the compiler as int, so the division is an integer division, resulting in an int of value 3.

With the var keyword, the compiler infers the type from the expression, so number in the first line is of type int.

In the second line, you explicitly declare number as decimal, so the int returned by 7 / 2 is cast to decimal. But the division itself has already been performed as integer division, so you still get 3.
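
A minimal sketch of the three cases (variable names are just for illustration):

var a = 7 / 2;              // int / int: integer division, a is the int 3
decimal b = 7 / 2;          // still integer division; the int 3 is then converted to decimal
decimal c = (decimal)7 / 2; // one operand is decimal, so decimal division: c is 3.5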


If you want the result to be decimal, use a decimal literal (the m suffix) for at least one of the operands:

var number = 7 / 2m;

Now one operand is decimal, so the division is performed as decimal division, and number is inferred as decimal with value 3.5.
René Vogt
3

Because 7 / 2 is (int)7 / (int)2, which is integer division. If you want a decimal result, use 7m / 2m.
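
A short sketch of the difference (the mixed form in the last line is an extra illustration, not part of the answer):

var a = 7 / 2;    // integer division: 3
var b = 7m / 2m;  // decimal division: 3.5
var c = 7m / 2;   // one decimal operand is enough: 3.5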

Manfred Radlwimmer