I have code that gets all the files in a directory, compresses each one, and creates a .zip file for it. I'm using the .NET Framework ZipArchive class from the System.IO.Compression namespace and the CreateEntryFromFile extension method. This works well except when processing large files (approximately 1 GB and up), where it throws a System.IO stream exception: "Stream too large".
The MSDN reference for the extension method states:
When ZipArchiveMode.Update is present, the size limit of an entry is limited to Int32.MaxValue. This limit is because update mode uses a MemoryStream internally to allow the seeking required when updating an archive, and MemoryStream has a maximum equal to the size of an int.
So this explains the exception I get, but it offers no way around the limitation. How can I enable large-file processing? (I've sketched the one idea I have after my code below.)
Here is my code. It's part of a class; for context, the GetDatabaseBackupFiles() and GetDatabaseCompressedBackupFiles() functions return lists of FileInfo objects that I iterate over:
public void CompressBackupFiles()
{
    var originalFiles = GetDatabaseBackupFiles();
    var compressedFiles = GetDatabaseCompressedBackupFiles();

    // Skip files that already have a matching compressed copy.
    var pendingFiles = originalFiles.Where(c => compressedFiles.All(d => Path.GetFileName(d.Name) != Path.GetFileName(c.Name)));

    foreach (var file in pendingFiles)
    {
        var zipPath = Path.Combine(_options.ZippedBackupFilesBasePath, Path.GetFileNameWithoutExtension(file.Name) + ".zip");

        using (ZipArchive archive = ZipFile.Open(zipPath, ZipArchiveMode.Update))
        {
            // The exception is thrown here when the file is large.
            archive.CreateEntryFromFile(file.FullName, Path.GetFileName(file.Name));
        }
    }

    DeleteFiles(originalFiles);
}
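If I'm reading the quoted documentation correctly, the MemoryStream only comes into play in Update mode. Since every .zip here is brand new and receives exactly one entry, I'm wondering whether opening the archive with ZipArchiveMode.Create instead would sidestep the limit, since Create mode writes each entry directly to the underlying file stream rather than buffering it. Here is a minimal sketch of that idea, reusing the same class members (CompressBackupFilesStreaming is just a hypothetical name, and I haven't verified this against the 1 GB+ files):

public void CompressBackupFilesStreaming()
{
    var originalFiles = GetDatabaseBackupFiles();
    var compressedFiles = GetDatabaseCompressedBackupFiles();
    var pendingFiles = originalFiles.Where(c => compressedFiles.All(d => Path.GetFileName(d.Name) != Path.GetFileName(c.Name)));

    foreach (var file in pendingFiles)
    {
        var zipPath = Path.Combine(_options.ZippedBackupFilesBasePath, Path.GetFileNameWithoutExtension(file.Name) + ".zip");

        // Create mode streams the entry straight to disk instead of
        // buffering it in a MemoryStream, so the Int32.MaxValue limit
        // should not apply. The zip must not already exist, which the
        // pendingFiles filter above should guarantee.
        using (ZipArchive archive = ZipFile.Open(zipPath, ZipArchiveMode.Create))
        {
            archive.CreateEntryFromFile(file.FullName, Path.GetFileName(file.Name));
        }
    }

    DeleteFiles(originalFiles);
}

Is this the right approach, or is there a supported way to add large entries to an archive in Update mode?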