The short answer is that computers represent numbers in binary, so not every base-10 fraction can be represented exactly as a JavaScript number.
The (slightly) longer answer is that JavaScript numbers are 64-bit floating-point values (equivalent to the double type in Java, C#, etc.; see Wikipedia's article on the double-precision floating-point format), so there is only a limited number of significand bits, and the precision of the base-2 representation is therefore limited.
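You can see that limit directly in a JavaScript console. The following is just a quick sketch; the exact digits assume standard IEEE 754 double-precision behavior:

```js
// 53 significand bits (52 stored + 1 implicit) means integers are only exact
// up to 2^53; past that point, consecutive integers can no longer be told apart.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true
console.log(2 ** 53 + 1 === 2 ** 53);                 // true: 2^53 + 1 rounds back down

// Fractions hit the same limit: 0.1 is stored as the nearest representable double.
console.log((0.1).toFixed(20)); // "0.10000000000000000555"
```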
As an analogy, consider representing the fraction 1/3 in base 10. Say that you only have so many digits to use. That means you can never ever ever represent 1/3 exactly in base 10, because doing so requires an infinite number of digits. Similarly, you can never represent 1/10 exactly in a finite number of bits, because in base 2 it requires an infinite number of bits.

What you're seeing here is a fraction (58/10) that the computer can't represent exactly in its limited number of bits, so it is coming as close as it can.
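Printing more digits than JavaScript's default formatting shows makes this visible. Again, just a console sketch; the exact digits assume the usual double-precision rounding:

```js
// JavaScript prints "5.8" by default because that's the shortest string that
// round-trips back to the same double, but the value actually stored is the
// nearest representable double, which is slightly below 5.8.
console.log((5.8).toFixed(20)); // "5.79999999999999982236"

// The same effect shows up in arithmetic with other inexact fractions:
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// Typical workarounds: round for display, or compare with a small tolerance.
console.log((0.1 + 0.2).toFixed(2));                      // "0.30"
console.log(Math.abs(0.1 + 0.2 - 0.3) < Number.EPSILON);  // true
```

The takeaway: the stored value is the closest double to what you wrote, and the default formatting usually hides the difference until arithmetic or a comparison exposes it.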