If I divide 150 by 100, I should get 1.5. But I am getting 1.0 when I divide like I did below:
double result = 150 / 100;
Can anyone tell me how to get 1.5?
Try:
double result = (double)150 / 100;
When you perform the division as you did:
double result = 150 / 100;
the division is first done as an int (giving 1) and the result is then implicitly converted to a double, which is why you get 1.0. You need a double operand in the expression for the division to be done as a double.
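A minimal console sketch (the class name is just illustrative) showing both behaviors side by side:

using System;

class DivisionDemo
{
    static void Main()
    {
        // int / int: the division happens first, producing 1, then 1 becomes 1.0.
        double intDivision = 150 / 100;

        // Casting one operand to double makes the whole division floating point.
        double doubleDivision = (double)150 / 100;

        Console.WriteLine(intDivision);    // 1
        Console.WriteLine(doubleDivision); // 1.5
    }
}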
Cast one of the ints to a floating point type. You should look into the difference between decimal and double and decide which you want, but to use double:
double result = (double)150 / 100;
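If it helps, here is a quick sketch (variable names are just for illustration) contrasting the two cast targets:

using System;

class CastSketch
{
    static void Main()
    {
        double d = (double)150 / 100;   // binary floating point, 1.5
        decimal m = (decimal)150 / 100; // base-10 decimal, 1.5 (often preferred for money)

        Console.WriteLine(d); // 1.5
        Console.WriteLine(m); // 1.5
    }
}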
Make one of the numbers a float:
var result = 150 / 100f;
Or you can make either number a double by adding .0:
double result = 150.0 / 100;
or
double result = 150 / 100.0;
or
double result = 150.0 / 100.0;
One or both operands must be a float or double on the right-hand side of the =.
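Putting those variants together in a small sketch (variable names are just illustrative):

using System;

class LiteralSketch
{
    static void Main()
    {
        var a = 150 / 100f;      // f suffix makes 100f a float, so a is a float
        double b = 150.0 / 100;  // .0 makes 150.0 a double literal
        double c = 150 / 100.0;  // .0 makes 100.0 a double literal
        double d = 150.0 / 100.0;

        Console.WriteLine($"{a} {b} {c} {d}"); // 1.5 1.5 1.5 1.5
    }
}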
If you're just using literal values like 150 and 100, C# treats them as integers, and integer division truncates the result (150 / 100 is 1). You can add a suffix like "f" for float or "m" for decimal so the math is no longer done on integers. So for example result = 150m/100m will give you a different answer than result = 150/100.
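For example, a small sketch (assuming a plain console app) comparing the integer and suffixed versions:

using System;

class SuffixSketch
{
    static void Main()
    {
        var integers = 150 / 100;   // int / int, truncates to 1
        var decimals = 150m / 100m; // m suffix: decimal / decimal, 1.5
        var floats   = 150f / 100f; // f suffix: float / float, 1.5

        Console.WriteLine(integers); // 1
        Console.WriteLine(decimals); // 1.5
        Console.WriteLine(floats);   // 1.5
    }
}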