I have two large byte arrays (byte[]) that I need to compare. They should be logically equal, but they are not always byte-for-byte equal; the difference comes down to how floating point numbers are represented in the two arrays.
Is there a way to compare them by identifying the floating point numbers and testing whether those values are within an error tolerance of each other?
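Something like this is what I mean by an error tolerance; the tolerance value here is just a placeholder, not something I have settled on:

using System;

public static class FloatComparison
{
    // Rough idea: treat two doubles as equal when they are within some
    // tolerance of each other. The default tolerance is arbitrary here;
    // I would pick something appropriate for my data.
    public static bool NearlyEqual(double a, double b, double tolerance = 1e-9)
    {
        return Math.Abs(a - b) <= tolerance;
    }
}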
I suspect this is caused by the fact that one of the byte arrays has been stored in and retrieved from a database (Postgres).
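To illustrate the problem, here is a made-up example of how two "logically equal" doubles can end up with different byte representations:

using System;

class Demo
{
    static void Main()
    {
        // Made-up values: two doubles that are logically "the same" but
        // differ in their last bits, and so also differ in their bytes.
        double original = 0.1 + 0.2;   // 0.30000000000000004
        double stored = 0.3;           // what might come back from storage

        Console.WriteLine(original == stored);  // False
        Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(original)));
        Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(stored)));
    }
}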
So far I compare the two byte arrays like so:
public static bool AreEqual(byte[] a, byte[] b)
{
    // Arrays of different lengths cannot be equal.
    if (a.Length != b.Length)
    {
        return false;
    }

    // Compare byte by byte; any mismatch means the arrays differ.
    for (var i = 0; i < a.Length; ++i)
    {
        if (a[i] != b[i])
        {
            return false;
        }
    }

    return true;
}
(Initially I serialized to JSON, but I had the same problem there. I thought serializing to bytes would fix it, but it didn't.)
EDIT: To add context: one of the two byte arrays is a 'recording' of complex object graphs representing the input and output of a method that I want to test. This byte array is stored in a Postgres database. The second byte array is the same data, only in memory. I am trying to compare the two as a sort of regression test after refactoring, but something happens to the floating point numbers when they are stored in and retrieved from the database.
I was hoping for a way to compare them without deserializing both back into their object representations. As far as I know this is impossible, but I had to ask to be sure.
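For reference, the fallback I am trying to avoid would look roughly like this; deserialize and equalsWithTolerance are placeholders standing in for my actual serializer and graph comparison, which I haven't shown here:

using System;

public static class GraphComparison
{
    // Sketch of the fallback: deserialize both byte arrays back into
    // object graphs and compare those, applying a tolerance to floating
    // point fields. Both delegates are placeholders, not real APIs.
    public static bool AreLogicallyEqual<T>(
        byte[] a,
        byte[] b,
        Func<byte[], T> deserialize,
        Func<T, T, bool> equalsWithTolerance)
    {
        return equalsWithTolerance(deserialize(a), deserialize(b));
    }
}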