I have a large Fortran/C++ project that assembles hundreds of Fortran intermediate files into a single executive. When I monitor some of the global single-precision floating-point variables, I get different results when I run the executive on a Windows 7 x64 machine vs. a Windows XP SP2 x86 machine. The differences are as much as 1-2%.
The project was built on the x86 machine and not rebuilt before testing on the x64 machine, although I am using the exact same compiler (Compaq Visual Fortran 6.6), the same development environment (Visual Studio 6.0), and identical code on both machines. The x64 machine has a Pentium E5400; the x86 machine has a Pentium 4 dual-core. Could this be an example of deterministic lockstep?
I know this is vague; I wish I could provide some code, but there are over 1 million lines. All of the variables are REAL*4 and are calculated in the Fortran code several hundred times per second. The C++ MFC code assembles it into the executive.