I know this is a really basic question, but please explain why this happens.
var number = 7 / 2;      // number is 3
decimal number = 7 / 2;  // number is still 3
Why isn't the result 3.5? Why is it always truncated to an int? Is it because both numbers are ints? It works when I cast one of the numbers to decimal.
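For reference, here is a quick sketch of what I mean, assuming the lines run inside a Main method (or as C# top-level statements); the variable names are just for illustration:

using System;

// Both operands are int, so the division is done in int and truncates:
var intResult = 7 / 2;              // 3
decimal stillTruncated = 7 / 2;     // 7 / 2 is evaluated as int (3) first, then converted to decimal

// Making either operand a decimal forces decimal division:
decimal withCast = (decimal)7 / 2;  // 3.5
decimal withSuffix = 7m / 2;        // 3.5 (the m suffix makes the literal a decimal)

Console.WriteLine(intResult);       // 3
Console.WriteLine(stillTruncated);  // 3
Console.WriteLine(withCast);        // 3.5
Console.WriteLine(withSuffix);      // 3.5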