
We're upgrading our app, including tests, from .NET Framework to .NET 6.

I know there are globalization changes in .NET 6 (ICU instead of NLS), as well as floating-point formatting becoming IEEE-compliant.

But the tests have also found some small differences in DateTime arithmetic:

BitConverter.GetBytes(TimeSpan.FromTicks(3124137599999990000).TotalDays)

returns (marginally) different answers between .NET 4.7.2 and .NET 6.0.

Any idea what causes this change? I'd prefer to understand the differences before updating the tests.

Sorry: the values are:

GetBytes(TotalDays)
    Net4: 231,255,255,255,77,150,75,65
    Net6: 230,255,255,255,77,150,75,65

TotalDays.ToString("G17")
    Net4: 3615899.9999999879
    Net6: 3615899.9999999884

It's the `TotalDays` calculation that differs (I used `GetBytes` as an alternative to formatting).

marc_s
Rob
    Which bit is the bit that's different? `TimeSpan.FromTicks(3124137599999990000).TotalDays` or `BitConverter.GetBytes()`? – canton7 Mar 28 '23 at 09:30
  • Can you go into more detail about what is different, preferably showing the results from .NET Framework and .NET? – phuzi Mar 28 '23 at 09:32
  • If I run `TimeSpan.FromTicks(3124137599999990000).TotalDays.ToString("f16")` in .NET Framework, I get `3615899.9999999900000000`, but with .NET 6 I get `3615899.9999999883584678`, so it seems that the difference is a change in the precision. – ProgrammingLlama Mar 28 '23 at 09:34
  • So looking at the source code ([.NET Framework](https://github.com/microsoft/referencesource/blob/master/mscorlib/system/timespan.cs#L125) and [.NET](https://github.com/dotnet/runtime/blob/389f286226bd71306db616128f95325e47ba45ef/src/libraries/System.Private.CoreLib/src/System/TimeSpan.cs#L180)), it seems that there is a difference in total days. Framework multiplies by `DaysPerTick` whereas .NET divides by `TicksPerDay`. Interestingly, these approaches both give the exact same result in .NET Framework as `mySpan.TotalDays`. It seems to be a change in the precision of the underlying maths. – ProgrammingLlama Mar 28 '23 at 09:50
  • OK, makes sense - I guess dividing a large double `X` by a large int64 `Y` is likely to be more precise than multiplying `X` by `1/Y`. – Rob Mar 28 '23 at 09:54
  • Does this answer your question? [Is floating point math broken?](https://stackoverflow.com/questions/588004/is-floating-point-math-broken) – Charlieface Mar 28 '23 at 11:53
  • @Charlieface - no, this is a difference in the calculated value between Net6 and NetFramework. – Rob Mar 28 '23 at 15:51
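Summing up the comments, the two code paths can be sketched in Python (an assumption: Python `float`s are IEEE-754 binary64, the same arithmetic the x64 JIT emits for these C# expressions; a 32-bit Framework process using x87 80-bit intermediates can differ again, so this reproduces the reported strings rather than claiming bit-exactness). `DAYS_PER_TICK` mirrors Framework's `private const double DaysPerTick = 1.0 / TicksPerDay`:

```python
ticks = 3124137599999990000
TICKS_PER_DAY = 864_000_000_000          # TimeSpan.TicksPerDay

# .NET Framework: TotalDays = _ticks * DaysPerTick, where the constant
# DaysPerTick = 1.0 / TicksPerDay is itself rounded to a double first --
# two roundings (reciprocal, then product).
DAYS_PER_TICK = 1.0 / TICKS_PER_DAY
old = float(ticks) * DAYS_PER_TICK

# .NET 6: TotalDays divides by TicksPerDay directly -- one correctly
# rounded division.
new = float(ticks) / float(TICKS_PER_DAY)

print(f"{old:.17g}")          # 3615899.9999999879
print(f"{new:.17g}")          # 3615899.9999999884
print(new - old == 2 ** -31)  # True: the two results differ by exactly one ulp
```

The extra rounding step in the reciprocal-multiply path is what produces the one-ulp difference the tests caught; neither runtime is "wrong", but the direct division is the more accurate of the two.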
