For example,
0.0000000000000000000000000001
is represented as (lo mid hi flags):
1 0 0 1c0000
When the above is divided by 10, the result is (lo mid hi flags):
0 0 0 0
But when it is multiplied by 0.1M, the result is (lo mid hi flags):
0 0 0 1c0000
In other words, according to Decimal, 0.0000000000000000000000000001 multiplied by 0.1 is 0.0000000000000000000000000000, but divided by 10 it is 0.
The following code demonstrates the two different results:
var o = 0.0000000000000000000000000001M;
Console.WriteLine($"{o * 0.1M}"); // prints 0.0000000000000000000000000000
Console.WriteLine($"{o / 10M}");  // prints 0
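For reference, the (lo mid hi flags) values above come straight from decimal.GetBits, which returns the four 32-bit parts in that order; the scale is stored in bits 16-23 of the flags word, so 1c0000 means a scale of 0x1c = 28, and bit 31 holds the sign. A small self-contained sketch that dumps both results (the Dump helper is only illustrative):

using System;

class Program
{
    // Dump the four 32-bit parts that make up a decimal: lo, mid, hi, flags.
    // The scale sits in bits 16-23 of flags, the sign in bit 31.
    static void Dump(string label, decimal d)
    {
        int[] bits = decimal.GetBits(d);
        Console.WriteLine($"{label}: {bits[0]:x} {bits[1]:x} {bits[2]:x} {bits[3]:x}");
    }

    static void Main()
    {
        decimal o = 0.0000000000000000000000000001M;
        Dump("o       ", o);           // 1 0 0 1c0000  (scale 0x1c = 28)
        Dump("o * 0.1M", o * 0.1M);    // 0 0 0 1c0000  (zero, scale 28)
        Dump("o / 10M ", o / 10M);     // 0 0 0 0       (zero, scale 0)
        Console.WriteLine(o * 0.1M == o / 10M);  // True: the values are equal, only the scale differs
    }
}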
I need to be able to replicate this behaviour (and all other Decimal arithmetic) in a virtual machine. Can someone point me to a spec or explain the rationale? System.Decimal.cs does not seem to offer any insight.
UPDATE: It seems this is just a bug in the decimal multiply implementation. Operators should preserve the scale (according to IEEE 754-2008), but multiply does not.
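To make the expectation concrete (this is my reading of the preferred-exponent rules, and the helper name is mine, not anything from the spec or the framework): for multiplication the preferred scale is the sum of the operand scales, for division their difference, limited to the 0..28 range that System.Decimal can hold. A rough sketch of that expectation:

using System;

static class ScaleCheck
{
    // Hypothetical helper: the scale the result of x * y or x / y would be
    // expected to carry if the operator preserved it.
    public static int ExpectedScale(decimal x, decimal y, bool multiply)
    {
        int sx = (decimal.GetBits(x)[3] >> 16) & 0xFF;  // scale of x, from flags bits 16-23
        int sy = (decimal.GetBits(y)[3] >> 16) & 0xFF;  // scale of y
        int s = multiply ? sx + sy : sx - sy;           // multiplication adds scales, division subtracts them
        return Math.Max(0, Math.Min(28, s));            // System.Decimal only supports scales 0..28
    }
}

// ExpectedScale(0.0000000000000000000000000001M, 0.1M, multiply: true)  -> 28 (29, clamped)
// ExpectedScale(0.0000000000000000000000000001M, 10M,  multiply: false) -> 28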