I'm learning JavaScript at the moment, and the particular lesson I'm on right now shows how to turn 1.075 into 7.5% for display purposes.
The math looks like this:
(1.075 - 1) * 100
and this is displaying in the results as 7.499999999999996. Why in the world is JavaScript calculating it like this? Every calculator I used to do the exact same math came up with 7.5, as it should. I didn't even need a calculator to realize something was odd, but I wanted to run the same numbers on as many calculators as I could just to reassure myself. I'm actually shocked that the guy doing the tutorial didn't say a single thing about it, other than showing how to display only 2 decimal places, because it just seems so odd. So what's going on here? I'm really curious.
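For reference, here's roughly what I'm running, plus the display rounding the tutorial showed (I'm assuming he used toFixed, since that's what rounds a number to 2 decimal places, but that part is my guess):

var rate = 1.075;
var percent = (rate - 1) * 100;   // the calculation from the lesson
console.log(percent);             // prints 7.499999999999996 for me
console.log(percent.toFixed(2));  // prints "7.50" (rounded just for display)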
Thanks!
I want to apologize to everyone who was obviously upset by my asking a duplicate question and decided to downvote me for it. I would have been happy to search for an answer, but I didn't even know how I could have begun to phrase a query for such a question.
To the people who answered and linked me to another article answering my question: thank you very much, and I apologize for the duplicate :)