
I was showing my cousin some of the beginner code I wrote 6 months ago when I started coding, and I ran into something strange that I can't explain even now that I know more about coding.

int countLight = 2;
int countModerate = 1;
int countStrong = 1;
int countVeryStrong = 1;
int count = countLight + countModerate + countStrong + countVeryStrong;

double percentLight = countLight * 1.0 / count * 100;
double percentModerate = countModerate * 1.0 / count * 100;
double percentStrong = countStrong * 1.0 / count * 100;
double percentVeryStrong = countVeryStrong * 1.0 / count * 100;

Console.WriteLine($"Light: {percentLight:F2}%");
Console.WriteLine($"Moderate: {percentModerate:F2}%");
Console.WriteLine($"Strong: {percentStrong:F2}%");
Console.WriteLine($"Very Strong: {percentVeryStrong:F2}%");

The thing I am wondering about is the "multiplied by 1.0" part. When I do the math on paper, it doesn't matter whether I multiply by 1.0 or not; I get the same answer either way. This is what I get when I leave the 1.0 in (the correct result):

Light: 40.00%
Moderate: 20.00%
Strong: 20.00%
Very Strong: 20.00%

When I remove the "1.0" I get this:

Light: 0.00%
Moderate: 0.00%
Strong: 0.00%
Very Strong: 0.00%

For example: 3 * 1.0 = 3, and 3 = 3, so there should be no difference in the result, but here there is. I would be glad if someone could explain this to me.

    Multiplying by 1.0 converts an `int` value into a `double`. Without it you get integer division, where e.g. 1/3 == 0. – wohlstad Jul 09 '22 at 10:10

2 Answers


In statements like this:

double percentLight = countLight / count * 100;

the expression on the right-hand side is evaluated using integer arithmetic, because all the values are integers. If count is larger than countLight, then countLight / count will be 0 (integer division discards the fractional part), and multiplying by 100 keeps it 0.
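A minimal, self-contained sketch (using the counts from the question) that shows where the truncation happens:

```csharp
using System;

int countLight = 2;
int count = 5;

// Integer division truncates toward zero: 2 / 5 == 0
int ratio = countLight / count;

// The promotion to double happens only at assignment, after
// the value is already 0, so the percentage stays 0.
double percentLight = countLight / count * 100;

Console.WriteLine(ratio);         // 0
Console.WriteLine(percentLight);  // 0
```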

On the other hand in statements like this:

double percentLight = countLight * 1.0 / count * 100;

in order to calculate countLight * 1.0, countLight is converted to double to match 1.0. The result is countLight as a double, so the subsequent division by count is floating-point division. Multiplying by 100 then gives the value you expected, because floating-point arithmetic was applied throughout.
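A short sketch of the same expression with the 1.0 in place (values from the question):

```csharp
using System;

int countLight = 2;
int count = 5;

// countLight * 1.0 is 2.0 (a double), so the division is
// floating-point: 2.0 / 5 == 0.4, and 0.4 * 100 == 40.
double percentLight = countLight * 1.0 / count * 100;

Console.WriteLine(percentLight);  // 40
```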

You can achieve the same by casting, e.g.:

double percentLight = (double)countLight / count * 100;

Since we cast countLight to a double, the expression evaluates the same way as the previous one.

wohlstad

In C#, multiplying an int by a double gives a double, but multiplying an int by an int gives an int; the same applies to division.

For example -

10 * 1.0 / 4 = 2.5
10 / 4 = 2
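A runnable sketch of those two cases (the values are illustrative only):

```csharp
using System;

// One double operand promotes the whole expression to double.
double withDouble = 10 * 1.0 / 4;  // 2.5

// All-int operands stay in integer arithmetic; the fraction is lost.
int intOnly = 10 / 4;              // 2

Console.WriteLine(withDouble);  // 2.5
Console.WriteLine(intOnly);     // 2
```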
Omri Attiya