If your aim is to reduce memory demands, then don't serialize and then encrypt: instead, serialize directly to an encrypting Stream. The Stream API is designed to be chained (decorator pattern) to perform multiple transformations without excessive buffering. Likewise, deserialize from a decrypting stream; don't decrypt and then deserialize. Done this way, the data is encrypted/decrypted on the fly as needed. In addition to reducing memory, this is also good for security, since the entire data never exists in decrypted form as a single buffer. See CryptoStream on MSDN for a full example.
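As a rough sketch of the chaining (the file name, the throw-away key handling and the Customer type here are purely illustrative; real code would use proper key management):

using System.IO;
using System.Security.Cryptography;
using ProtoBuf;

static void SaveAndLoad(Customer customer)
{
    using (var aes = Aes.Create()) // key/IV would normally come from key management
    {
        // serialize -> encrypt in a single pass, straight to the file
        using (var file = File.Create("customer.bin"))
        using (var crypto = new CryptoStream(file, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            Serializer.Serialize(crypto, customer);
        } // disposing the CryptoStream flushes the final block

        // decrypt -> deserialize in a single pass, straight from the file
        using (var file = File.OpenRead("customer.bin"))
        using (var crypto = new CryptoStream(file, aes.CreateDecryptor(), CryptoStreamMode.Read))
        {
            var clone = Serializer.Deserialize<Customer>(crypto);
        }
    }
}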
Some additional notes: if you do happen to use protobuf-net, there are ways of reducing in-memory buffering by using "grouped" encoding. The default for sub-messages (including lists) is "length-prefixed", and the way it usually gets that length is by buffering the data in memory to measure it. However, protobuf also supports a format that uses a start/end marker, which never requires knowing the length, so never requires buffering; the entire sequence can be written in a single pass direct to the output (it does still use a buffer internally to improve IO, but it pools that buffer for maximum re-use). For sub-objects, this is as simple as:
[ProtoMember(11, DataFormat = DataFormat.Grouped)]
public Customer Customer {get;set;} // a sub-object
(where there is no significance in the 11)
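The same applies to lists/repeated members; for example (the 12 and the Orders member are just illustrative):

[ProtoMember(12, DataFormat = DataFormat.Grouped)]
public List<Order> Orders {get;set;} // a sub-collection, also written without length-prefix buffering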