In C# if you're using double, you're not really getting 16.55 - it may print out as that, but that's not the real value. In both cases you should be using a decimal-based type instead of a binary-based type. In C# that's easy - you can use the decimal (aka System.Decimal) type. I don't know if there's an equivalent in JavaScript, unfortunately.
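To see for yourself that 16.55 isn't really 16.55 as a binary double, you can ask JavaScript (whose numbers are all binary doubles) to print more digits than the default rounding shows - a minimal sketch:

```javascript
// The default string conversion rounds to the shortest representation,
// which hides the error; asking for 20 decimal places exposes the
// nearest representable double instead of the "16.55" you typed.
const x = 16.55;
console.log(x.toFixed(2));  // rounds back to "16.55", hiding the error
console.log(x.toFixed(20)); // prints the actual stored value's digits

// The classic demonstration of the same problem:
console.log(0.1 + 0.2);     // 0.30000000000000004
```

The same thing happens with C#'s double; decimal, by contrast, stores 16.55 exactly.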
You should probably read up on binary floating point and decimal floating point to understand why this is important. (Those are .NET-focused articles, but the principles apply to JS too.)
For currency values, one way to avoid this being a problem is to use integers to start with - keep the number of pennies (cents, whatever) in your value, and then display it by writing out (number / 100) + "." + (number % 100), using integer division and zero-padding the remainder to two digits (so 1605 pennies displays as "16.05", not "16.5"), with a different decimal separator if necessary. I strongly suspect you'll find this the best way to get consistent and predictable results for this particular situation.
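The integer-pennies approach might look like this in JavaScript (formatCents and its separator parameter are hypothetical names for illustration, not a standard API):

```javascript
// Store money as an integer number of pennies; all arithmetic on
// integers within Number.MAX_SAFE_INTEGER is exact in JavaScript.
function formatCents(totalCents, separator = ".") {
  const sign = totalCents < 0 ? "-" : "";
  const cents = Math.abs(totalCents);
  const whole = Math.floor(cents / 100);             // integer division
  const frac = String(cents % 100).padStart(2, "0"); // zero-pad: 5 -> "05"
  return sign + whole + separator + frac;
}

const priceCents = 1655;      // represents 16.55
const total = priceCents * 3; // 4965 - exact, no rounding error
console.log(formatCents(total)); // "49.65"
console.log(formatCents(5));     // "0.05" - the padding matters here
```

The key point is that no fractional value is ever stored, so there's nothing for binary floating point to misrepresent; the decimal point only appears at display time.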