
I'm fairly new to System.IO streams, and am therefore not entirely sure when and how I should use the different streams.

Let me explain my use-case:

Currently, I have an MS SQL database with FILESTREAM enabled, in which I store the FileName, the data as a byte[], and a Section for each file:

public partial class MyFiles {
    public int Id { get; set; }
    public int Section { get; set; } 
    public string FileName { get; set; }
    public byte[] Data { get; set; }
}

At some point, I want to be able to download all the files that belong to a specific section. I therefore want to:

  • Query the files specific to a section
  • Write to a ZipArchive
  • Pass the zipped file as a FileContentResult

I have decided to use a MemoryStream to achieve this, because it's fast and convenient in the sense that I don't have to use the filesystem on the server. The implementation looks as follows:

MemoryStream stream;
using (stream = new MemoryStream())
{
    using (var zipArchive = new ZipArchive(stream, ZipArchiveMode.Create))
    {

        foreach (MyFiles file in fetchedFiles)
        {
            var fileEntry = zipArchive.CreateEntry(file.FileName);
            using (var entryStream = fileEntry.Open())
            {
                entryStream.Write(file.Data, 0, file.Data.Length);
            }
        }
    }
}
return new SuccessResult<MemoryStream>(stream);
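As an aside, the same buffering approach can be written so that stream ownership is explicit. A minimal sketch, assuming the archive bytes end up in an ASP.NET controller response; the `BuildZip` helper, the `ZipHelper` class, and the MIME type are illustrative, not part of the original code:

```csharp
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

// MyFiles as defined in the question (trimmed to the fields used here).
public class MyFiles
{
    public string FileName { get; set; }
    public byte[] Data { get; set; }
}

public static class ZipHelper
{
    // Builds the archive in memory and returns the finished bytes.
    public static byte[] BuildZip(IEnumerable<MyFiles> fetchedFiles)
    {
        using (var stream = new MemoryStream())
        {
            // leaveOpen: true keeps the MemoryStream usable after the
            // ZipArchive is disposed (disposing the archive is what writes
            // the central directory, so dispose it before reading the bytes).
            using (var zipArchive = new ZipArchive(stream, ZipArchiveMode.Create, leaveOpen: true))
            {
                foreach (var file in fetchedFiles)
                {
                    var entry = zipArchive.CreateEntry(file.FileName);
                    using (var entryStream = entry.Open())
                    {
                        entryStream.Write(file.Data, 0, file.Data.Length);
                    }
                }
            }
            return stream.ToArray();
        }
    }
}

// In an MVC action, the result could then be returned as:
// return File(ZipHelper.BuildZip(fetchedFiles), "application/zip", "section.zip");
```

Returning a `byte[]` sidesteps the question of who disposes the stream, but it still buffers the entire archive in memory, so it shares the scaling concern described below.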

Everything is working, and I can successfully retrieve my zipped files.

However, now I'm starting to doubt this implementation, as it could end up handling files that combined add up to, say, 512 MB to 1 GB.

The server is really powerful, but obviously I don't want to burn all the memory in this process.

Am I moving in the wrong direction with MemoryStream, and should I ideally consider something else?

Jeppe Christensen
  • If an app attempts to buffer too many uploads/downloads, the site crashes when it runs out of memory or disk space. If the size or frequency of file uploads/downloads is exhausting app resources, use **[streaming](https://docs.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.1#upload-large-files-with-streaming-1)**. Check the following threads: [Uploading and Downloading large files in ASP.NET Core 3.1](https://stackoverflow.com/questions/62502286/) and [C# Download big file from Server with less memory consumption](https://stackoverflow.com/questions/43804446/). – Zhi Lv Sep 11 '20 at 05:52

1 Answer


The other thing to consider is that you actually have the files in memory twice: once in fetchedFiles, and again in the zip file.

Ideally you need to stream the files from the DB to the output stream, or indeed the zip file.

Rather than getting all the files and then adding them to the zip, open your zip file and then stream each file into it.
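A minimal sketch of that shape, where each file's content arrives as a Stream rather than a byte[]; the `WriteZip` name, the `StreamingZip` class, and the tuple-based source are illustrative assumptions, not from the original answer:

```csharp
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

public static class StreamingZip
{
    // Streams each (name, content) pair straight into the archive.
    // CopyTo uses a small fixed-size buffer, so only one chunk of one
    // file is in memory at a time instead of every file as a byte[].
    public static void WriteZip(Stream output, IEnumerable<(string Name, Stream Content)> files)
    {
        using (var zip = new ZipArchive(output, ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var (name, content) in files)
            {
                var entry = zip.CreateEntry(name);
                using (var entryStream = entry.Open())
                {
                    content.CopyTo(entryStream);
                }
            }
        }
    }
}
```

When reading from SQL Server with ADO.NET, executing the reader with CommandBehavior.SequentialAccess and calling reader.GetStream(ordinal) yields such a Stream per row, so the BLOB is never fully materialized; and if `output` is the HTTP response body rather than a MemoryStream, the finished archive is never buffered in memory either.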

Simon Halsey
  • Oh, so instead you would do individual queries for files inside `entryStream.Write(...)` instead of getting the entire IEnumerable? In case I sort this, would you then not be worried about OutOfMemoryExceptions? – Jeppe Christensen Sep 10 '20 at 10:48