I was showing my cousin some of the beginner code I wrote 6 months ago, when I started coding, and I ran into something strange that I can't explain to myself even now that I know more.
int countLight = 2;
int countModerate = 1;
int countStrong = 1;
int countVeryStrong = 1;
int count = countLight + countModerate + countStrong + countVeryStrong;
double percentLight = countLight * 1.0 / count * 100;
double percentModerate = countModerate * 1.0 / count * 100;
double percentStrong = countStrong * 1.0 / count * 100;
double percentVeryStrong = countVeryStrong * 1.0 / count * 100;
Console.WriteLine($"Light: {percentLight:F2}%");
Console.WriteLine($"Moderate: {percentModerate:F2}%");
Console.WriteLine($"Strong: {percentStrong:F2}%");
Console.WriteLine($"Very Strong: {percentVeryStrong:F2}%");
The thing I am wondering about is the "multiplied by 1.0" part. When I do the math on paper, it makes no difference whether I multiply by 1.0 or not; I get the same answer either way. This is what I get when I leave the 1.0 in (the correct result, so the code itself works):
Light: 40.00%
Moderate: 20.00%
Strong: 20.00%
Very Strong: 20.00%
When I remove the "1.0", I get this (the exact change is shown after the output):
Light: 0.00%
Moderate: 0.00%
Strong: 0.00%
Very Strong: 0.00%
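To be clear, the only thing I changed was deleting the * 1.0; the rest of each line stayed the same, for example:
double percentLight = countLight / count * 100;
and the other three lines were changed the same way.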
For example: 3 * 1.0 = 3, and 3 is just 3, so there should be no difference in the result, but here there is. I would be glad if someone could explain this to me.
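In case it helps, here is the smallest version of what I mean, just plugging in the numbers from the program (2 light entries out of 5 total) instead of the variables; these two lines are not part of the original code, only a sketch of the same pattern:
Console.WriteLine(2 * 1.0 / 5 * 100); // prints 40
Console.WriteLine(2 / 5 * 100);       // prints 0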