We have a solution where we store a fairly large/complex C# object in our database as binary data. My concern is that when the class is changed, data that was saved to the database under the old version of the class will fail to deserialize after the code change.
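For context, here's a simplified stand-in for the kind of class we're storing; Customer and its members are placeholders, and the real class is much larger, but the shape is similar:

using System;
using System.Collections.Generic;

[Serializable]
public class Customer                      // placeholder; the real class is much larger/more complex
{
    public int Id { get; set; }            // placeholder members
    public string Name { get; set; }
    public List<string> Tags { get; set; }
}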
Here is the code we're using to serialize objects:
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static byte[] SerializeObject(object toBeSerialized)
{
    // MemoryStream is IDisposable, so wrap it in a using block.
    using (var stream = new MemoryStream())
    {
        var serializer = new BinaryFormatter();
        serializer.Serialize(stream, toBeSerialized);
        // ToArray copies the whole buffer regardless of Position,
        // so resetting Position first is unnecessary.
        return stream.ToArray();
    }
}
Here is our Deserialize method:
public static T DeserializeObject<T>(byte[] toBeDeserialized)
{
    using (var input = new MemoryStream(toBeDeserialized))
    {
        var formatter = new BinaryFormatter();
        // A MemoryStream created over a byte[] already starts at position 0,
        // so the Seek call isn't needed.
        return (T)formatter.Deserialize(input);
    }
}
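For reference, this is roughly how the helpers get used end to end (using the placeholder Customer type from above; the byte[] is what we persist to the database):

var original = new Customer { Id = 1, Name = "Example", Tags = new List<string> { "vip" } };

byte[] blob = SerializeObject(original);               // stored in the database as binary data
Customer restored = DeserializeObject<Customer>(blob); // read back later, possibly after a code change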
My question is: what has to change in the class, and how much, for deserialization of an object saved under the old version to fail?
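To make that concrete, here is a hypothetical newer version of the placeholder Customer class; the renamed and added members below are invented for illustration, not changes we've actually made:

[Serializable]
public class Customer
{
    public int Id { get; set; }

    // Hypothetical rename from "Name" -- old payloads were written against
    // the old auto-property backing field name.
    public string FullName { get; set; }

    // Hypothetical new member added after rows were already saved -- old
    // payloads contain no value for it.
    public decimal CreditLimit { get; set; }
}

// Would this still succeed against a byte[] written before these changes?
// Customer restored = DeserializeObject<Customer>(oldBlobFromDatabase);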