
I'm serializing/deserializing with BinaryFormatter; the resulting serialized file is ~80MB in size, and deserialization takes a few minutes. How could I improve on this? Here's the deserialization code:

    public static Universe DeserializeFromFile(string filepath)
    {
        // Dispose the stream deterministically, even if deserialization throws.
        using (FileStream fs = new FileStream(filepath, FileMode.Open))
        {
            BinaryFormatter bf = new BinaryFormatter();
            try
            {
                return (Universe)bf.Deserialize(fs);
            }
            catch (SerializationException e)
            {
                Console.WriteLine("Failed to deserialize. Reason: " + e.Message);
                throw;
            }
        }
    }

Should I read the whole file into memory before deserializing, or use some other serialization technique?

Carlsberg

6 Answers


Try UnsafeDeserialize. It is said to improve speed.
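A minimal sketch of what this looks like, assuming the same `Universe` type as in the question. `UnsafeDeserialize` skips some of the formatter's permission checks, so it should only be used on data you trust:

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static Universe UnsafeDeserializeFromFile(string filepath)
{
    using (FileStream fs = new FileStream(filepath, FileMode.Open))
    {
        BinaryFormatter bf = new BinaryFormatter();
        // The second argument is a HeaderHandler delegate;
        // passing null means "no header processing".
        return (Universe)bf.UnsafeDeserialize(fs, null);
    }
}
```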

leppie
  • UnsafeDeserialize 460138 ms, Deserialize 459967 ms, i.e. Deserialize was actually faster! I passed null for the headers argument of UnsafeDeserialize; is this perhaps the reason? – Carlsberg Oct 25 '09 at 14:00

I know this is an old question, but I stumbled upon a solution that improved my deserialization speed substantially. This is useful if you have large sets of data.

Upgrade your target framework to 4.7.1+ and enable the following switch in your app.config:

<runtime>
    <!-- Use this switch to make BinaryFormatter fast with large object graphs starting with .NET 4.7.2 -->
    <AppContextSwitchOverrides value="Switch.System.Runtime.Serialization.UseNewMaxArraySize=true" />
</runtime>

Sources: BinaryFormatter, AppContextSwitchOverrides

scuba88

Please take a look at this thread.

KV Prajapati

Try reading the file into a memory stream first in one go, then deserialize using the memory stream.
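A sketch of that suggestion, assuming the same `Universe` type and file layout as in the question:

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public static Universe DeserializeFromFileBuffered(string filepath)
{
    // Read the whole file into memory in one sequential pass...
    byte[] data = File.ReadAllBytes(filepath);

    // ...then deserialize from a MemoryStream, so the formatter's many
    // small Read calls hit RAM instead of going back to the disk.
    using (MemoryStream ms = new MemoryStream(data))
    {
        BinaryFormatter bf = new BinaryFormatter();
        return (Universe)bf.Deserialize(ms);
    }
}
```

A middle ground is wrapping the FileStream in a BufferedStream, which avoids holding the entire 80MB in a second buffer.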

Storm
    If that makes things better rather than worse, then the serialization format sucks. Why do an I/O bound task *followed by* a CPU-bound task when you could do both, interleaved? – hobbs Oct 25 '09 at 12:05

How complex is the data? If it is an object tree (rather than a full graph), then you might get some interesting results from trying protobuf-net. It is generally pretty easy to fit onto existing classes, and is generally much smaller, faster, and less brittle (you can change the object model without trashing the data).

Disclosure: I'm the author, so might be biased - but it really isn't terrible... I'd happily lend some* time to help you try it, though.

*=within reason
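For illustration, here is roughly what protobuf-net looks like on a hypothetical `Universe`/`Planet` model (the types, members, and field numbers are invented for this sketch; the question never shows the real model):

```csharp
using System.Collections.Generic;
using System.IO;
using ProtoBuf;

[ProtoContract]
public class Universe
{
    // Field numbers identify members in the wire format;
    // keep them stable across versions of the model.
    [ProtoMember(1)]
    public List<Planet> Planets { get; set; }
}

[ProtoContract]
public class Planet
{
    [ProtoMember(1)] public string Name { get; set; }
    [ProtoMember(2)] public double Mass { get; set; }
}

// Serialize:
//   using (var fs = File.Create("universe.bin"))
//       Serializer.Serialize(fs, universe);
//
// Deserialize:
//   using (var fs = File.OpenRead("universe.bin"))
//       universe = Serializer.Deserialize<Universe>(fs);
```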

Marc Gravell

Implement ISerializable in the Universe class.
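A minimal sketch of what that entails, with hypothetical field names since the real `Universe` members aren't shown. You write only the state you need in `GetObjectData` and read it back in a special deserialization constructor:

```csharp
using System;
using System.Runtime.Serialization;

[Serializable]
public class Universe : ISerializable
{
    private Planet[] planets;  // hypothetical field

    public Universe() { }

    // Special constructor invoked by the formatter during deserialization.
    protected Universe(SerializationInfo info, StreamingContext context)
    {
        planets = (Planet[])info.GetValue("planets", typeof(Planet[]));
    }

    // Write only the fields you choose, under names you control.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("planets", planets);
    }
}
```

This gives you control over what gets written, but on its own it won't necessarily make BinaryFormatter faster unless you cut down the serialized state.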

Gary