I have complex test cases involving geometric entities. It is easy for me to verify visually that a test case has passed by looking at the generated geometry in a viewport, like so.
When I am satisfied that the test is passing, I take a hash of the geometry and assert against it:
```csharp
var hashCode = GetHashCodeForRegionResult(region);
hashCode.Should().Be(1243932659);
```
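For reference, here is a minimal sketch of the kind of thing `GetHashCodeForRegionResult` does; the `Region` and `Point2D` types and the combining scheme below are simplified stand-ins, not my actual implementation:

```csharp
using System;
using System.Collections.Generic;

// Simplified stand-ins for the real geometry types used in my tests.
public struct Point2D
{
    public double X;
    public double Y;
}

public class Region
{
    public List<Point2D> Vertices = new List<Point2D>();
}

public static class RegionHashing
{
    // Combines the raw 64-bit patterns of every coordinate, so the hash
    // depends only on the exact bits produced by the geometry algorithm.
    public static int GetHashCodeForRegionResult(Region region)
    {
        unchecked
        {
            int hash = 17;
            foreach (var vertex in region.Vertices)
            {
                hash = hash * 31 + BitConverter.DoubleToInt64Bits(vertex.X).GetHashCode();
                hash = hash * 31 + BitConverter.DoubleToInt64Bits(vertex.Y).GetHashCode();
            }
            return hash;
        }
    }
}
```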
The calculation involves taking the hash of floating point numbers, which would ordinarily be a bad idea. However, if I feed the same input data to the algorithm, I would expect exactly the same result down to the bit level. Is this expectation valid, especially if I run the same test on the .NET 4.5.1 runtime on different CPUs (AMD vs. Intel, 64-bit vs. 32-bit)?
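To be concrete about what I mean by "down to the bit level", here is a small standalone example (not from my test code) showing that two doubles can print identically yet differ in their last bit, so a hash of the raw bits only matches when the computation reproduces the exact same bit pattern:

```csharp
using System;

class BitLevelDemo
{
    static void Main()
    {
        double a = 0.1 + 0.2;   // carries accumulated rounding error
        double b = 0.3;         // parsed directly from the literal

        // Both print as 0.3 under .NET Framework's default 15-digit formatting...
        Console.WriteLine(a);
        Console.WriteLine(b);

        // ...but the underlying bit patterns differ by one unit in the last place,
        // so any hash computed from those bits will differ as well.
        Console.WriteLine(BitConverter.DoubleToInt64Bits(a));
        Console.WriteLine(BitConverter.DoubleToInt64Bits(b));
        Console.WriteLine(BitConverter.DoubleToInt64Bits(a) == BitConverter.DoubleToInt64Bits(b)); // False
    }
}
```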