
I've been wondering recently how the browser console and the Node.js REPL always print numbers at the right precision. For example, 0.1 is not printed as something like 0.1000000000000000055511151231257827021181583404541015625, yet numbers like 1.234567890123 are printed in full. I think it is using Number.toPrecision() (correct me if I'm wrong). What default value is used as the parameter, and does it depend on the number being printed?

Yuki
  • I think `0.1000000000000000055511151231257827021181583404541015625` is outside the precision of JavaScript numbers – evolutionxbox Mar 26 '21 at 16:03
  • @evolutionxbox but this is how unbounded printing would print 0.1. – Yuki Mar 26 '21 at 16:04
  • Does this answer your question? [How to deal with floating point number precision in JavaScript?](https://stackoverflow.com/questions/1458633/how-to-deal-with-floating-point-number-precision-in-javascript) – evolutionxbox Mar 26 '21 at 16:04
  • `0.10000000000000006` will print fine, but `0.100000000000000006` (one extra zero) will be truncated to `0.10000000000000000`, which is the same as `0.1` – evolutionxbox Mar 26 '21 at 16:05
  • Without an argument, `.toPrecision()` is basically `.toString()`. – Pointy Mar 26 '21 at 16:06
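To sketch what the comments are getting at: the default conversion used when printing (ECMAScript's Number-to-string algorithm, as in `String(n)` or `n.toString()`) does not use a fixed precision at all. It emits the shortest decimal digit string that parses back to exactly the same IEEE-754 double. A minimal illustration (assuming Node.js or any modern browser console; `toPrecision(21)` is the largest argument guaranteed by older editions of the spec):

```javascript
// Default printing: the shortest decimal string that round-trips
// to the same double. No fixed precision parameter is involved.
const short = (0.1).toString();          // shortest round-trip form, "0.1"
const recovered = Number(short) === 0.1; // parsing it back yields the same double

// The hidden digits of the stored value can be exposed by asking
// for more significant digits explicitly:
const long = (0.1).toPrecision(21);      // "0.100000000000000005551"

// And, as noted above, toPrecision() without an argument falls back
// to the same behaviour as toString():
const sameAsToString = (1.5).toPrecision() === (1.5).toString();

console.log(short, recovered, long, sameAsToString);
```

So a number like 1.234567890123 is printed in full simply because no shorter string round-trips to its double, while every decimal shorter than "0.1" fails to round-trip to 0.1's double.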

0 Answers