I need some help understanding why this happens and what I can do to fix it.
I have an object A with some properties and a list of object B. Object B contains a decimal. Object A gets serialized to JSON at some point and saved for later use.
If I set object B's decimal value to 100, then serialize and deserialize, the value comes back as 100.0, and the list comparison returns false.
This means I can't compare an object A that has been serialized and deserialized to an object A that hasn't.
Copy/paste sample code below:
List<decimal> items = new List<decimal>();
items.Add(1.0M);
List<decimal> items2 = new List<decimal>();
items2.Add(1);
Console.WriteLine(items == items2); //false
decimal d = 1.0M;
decimal d2 = 1;
Console.WriteLine(d == d2); //true
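As far as I can tell the trailing zero really is kept on the decimal itself, because the two values print differently even though == treats them as equal:
decimal plain = 1M;
decimal withTrailingZero = 1.0M;
Console.WriteLine(plain);                     // prints 1
Console.WriteLine(withTrailingZero);          // prints 1.0
Console.WriteLine(plain == withTrailingZero); // true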
Edit - added a code example of roughly what's in my code:
using System;
using System.Collections.Generic;
using System.Linq;

public class ObjectA
{
    public string Name { get; set; }
    public List<ObjectB> Items { get; set; }
}

public class ObjectB
{
    public decimal Amount { get; set; }
}

static void Main(string[] args)
{
    ObjectA a1 = new ObjectA() { Name = "1", Items = new List<ObjectB>() { new ObjectB() { Amount = 1 } } };
    ObjectA a2 = new ObjectA() { Name = "1", Items = new List<ObjectB>() { new ObjectB() { Amount = 1.0M } } };

    Console.WriteLine(a1 == a2); //false
    Console.WriteLine(a1.Items.SequenceEqual(a2.Items)); //false
}
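For completeness, the round trip I'm describing looks roughly like this. I'm showing Newtonsoft.Json here only as an example serializer, since I didn't include that part of my code; in the real code the JSON string is saved and loaded again in between:
// requires: using Newtonsoft.Json;
ObjectA before = new ObjectA() { Name = "1", Items = new List<ObjectB>() { new ObjectB() { Amount = 100 } } };

string json = JsonConvert.SerializeObject(before);             // Amount is written as 100.0
ObjectA after = JsonConvert.DeserializeObject<ObjectA>(json);  // Amount comes back as 100.0

Console.WriteLine(before.Items.SequenceEqual(after.Items));    //false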