
My friend stumbled into a strange error (at least I think it's an error). When you multiply a certain number in your browser's console, 1.4 in my case, the output is very interesting when you multiply it by 3, 6, or 7. Check these outputs in Firefox:

[screenshot: Firefox console output for 1.4 multiplied by 3, 6, and 7]

In Chrome it's the same as in Firefox. In Edge and IE it's even more interesting: there, the result of multiplying by 6 is even further off:

[screenshot: Edge/IE console output for the same multiplications]

I googled it but couldn't find any explanation for this error. Does anyone know why the browsers fail at this multiplication?

EDIT: The comments below explain why the number isn't simply, say, 4.2, but instead has a whole lot of digits, giving 4.199999999999999. Thanks for referring me to those answers.

However, why does this only happen with 3, 6, and 7, and not with the other numbers? (Maybe it also happens with numbers higher than 10; I didn't check that.)
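For reference, the behavior can be reproduced in any browser console or in Node.js (the value for 1.4 * 3 matches what the edit above reports; 2 and 4 come out clean because multiplying by a power of 2 only shifts the exponent and introduces no new rounding):

```javascript
// Reproducing the console results from the question:
console.log(1.4 * 2); // 2.8  (power of 2: exact exponent shift)
console.log(1.4 * 3); // 4.199999999999999
console.log(1.4 * 4); // 5.6  (power of 2: exact exponent shift)
console.log(1.4 * 6); // 8.399999999999999
console.log(1.4 * 7); // 9.799999999999999
```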

Snake

1 Answer


Your friend stumbled across floating point representation of numbers.

The quote below is taken from this answer: https://stackoverflow.com/a/21895757/210971

In most programming languages, floating point numbers are represented a lot like scientific notation: with an exponent and a mantissa (also called the significand). A very simple number, say 9.2, is actually this fraction:

5179139571476070 × 2^-49

Where the exponent is -49 and the mantissa is 5179139571476070. The reason it is impossible to represent some decimal numbers this way is that both the exponent and the mantissa must be integers. In other words, all floats must be an integer multiplied by an integer power of 2.

LookAheadAtYourTypes