So, I'm just starting out learning to program, and I know that rounding-error questions are among the most common on here. But I haven't been able to find one that clearly explains why this particular thing happens, or rather, how I can know what to look for. I understand that it happens because of the way the variables are stored, and that there is a limit to how many decimal places they can hold, but I'm unsure why the behaviour differs from one number to the next. Beyond that explanation, I can't find anything further.
My textbook warns me to be careful with rounding errors, and I do know how to round the result to the decimal place I want. Still, I want to know how to anticipate which numbers the error is likely to happen with. While programming, we can print values to the console to check that they are correct, as in the sketch below, but is there some way to 'detect' the error before it even happens, or to be wary of which numbers do what?
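Here is the kind of console check I mean, a minimal sketch assuming Python purely for illustration (my question isn't about any particular language):

```python
# A minimal sketch of the kind of console check I mean; Python is only
# an assumption here, the question isn't language-specific.
value = 10.03
result = value * 100
print(result)                     # shows 1002.9999999999999 instead of 1003
print(result == 1003)             # False: the rounding error is already there
print(round(result, 2) == 1003)   # True: rounding to 2 places fixes it here
```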
Some numbers have no rounding error. Example: 10.02 * 100 = 1002.
But 10.03 * 100 = 1002.9999999999999
10.04 * 100 = 1003.9999999999999
10.05 * 100 = 1005.0000000000001
If this has to do with binary, why isn't 4 safe? 4 is a power of two, after all.
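For reference, here's a small sketch that reproduces those multiplications and also prints the exact value each literal is actually stored as (again, Python is just an assumption on my part; `Decimal(float)` converts a float to its exact stored value):

```python
from decimal import Decimal

# Reproduce the multiplications above, and show the exact value each
# literal is actually stored as (Decimal(x) converts the float exactly).
for x in (10.02, 10.03, 10.04, 10.05):
    print(f"{x} * 100 = {x * 100!r}")
    print(f"   {x} is stored as {Decimal(x)}")
```

From that output it looks like none of these literals is stored as exactly the decimal value I typed, so I still don't see how to tell in advance which products will come out 'clean' and which won't.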