I have the following use case: I have a web service for downloading CSV files. The CSV files are formed on the fly, i.e., data is retrieved from the database row by row and converted at runtime into CSV, which is then written into a MemoryStream that the web service returns for downloading. Even though generating the CSV is pretty fast, the file can grow beyond 2 GB, and when it does, an OutOfMemoryException is thrown because MemoryStream cannot hold that much data.
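As far as I understand, the 2 GB ceiling is baked into MemoryStream itself: it is backed by a single contiguous byte[] and its capacity is an Int32, so it cannot represent more than int.MaxValue bytes even on a 64-bit machine (and on a 32-bit process a contiguous allocation that large typically fails well before that). A quick check, which rejects an over-2 GB length without allocating anything:

```csharp
using System;
using System.IO;

class MemoryStreamLimitDemo
{
    static void Main()
    {
        var ms = new MemoryStream();
        try
        {
            // MemoryStream's length/capacity are Int32-based, so any length
            // above int.MaxValue is rejected outright, before any allocation.
            ms.SetLength((long)int.MaxValue + 1);
        }
        catch (ArgumentOutOfRangeException)
        {
            Console.WriteLine("lengths above 2 GB are rejected");
        }
    }
}
```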
This is a test code snippet I wrote to better illustrate the problem:
//This is my WCF web service
public class DownloadService : IDownloadService
{
    //This is the method for downloading the CSV
    public Stream DownloadFile()
    {
        var users = GetUsers();
        //The MemoryStream must not be wrapped in a using block here:
        //WCF reads from the returned stream after this method exits
        var memoryStream = new MemoryStream();
        using (var streamWriter = new StreamWriter(memoryStream, Encoding.UTF8, 1024, leaveOpen: true))
        {
            foreach (var user in users)
            {
                var csvRow = $"{user.Id},{user.FirstName},{user.LastName}\n";
                //When the backing buffer has to grow beyond 2GB,
                //OutOfMemoryException is thrown here
                streamWriter.Write(csvRow);
            }
        }
        memoryStream.Position = 0; //rewind so WCF reads from the start

        WebOperationContext.Current.OutgoingResponse.Headers.Add("Content-Disposition", "attachment; filename=Users.csv");
        WebOperationContext.Current.OutgoingResponse.ContentType = "text/csv";
        return memoryStream;
    }
    //Method that streams Users from the database one row at a time
    private IEnumerable<User> GetUsers()
    {
        //User is a reserved word in T-SQL, so the table name is bracketed
        string cmdQuery = "select Id, FirstName, LastName from [User]";
        using (var connection = new SqlConnection("some connection string"))
        using (var cmd = new SqlCommand(cmdQuery, connection))
        {
            connection.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    yield return new User
                    {
                        Id = (int)reader["Id"],
                        FirstName = reader["FirstName"].ToString(),
                        LastName = reader["LastName"].ToString()
                    };
                }
            }
        }
    }
}
public class User
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}
Is there any way I can bypass this limit, or maybe use another kind of stream? Please note the application is 32-bit, and I would rather not use a FileStream: as I said, generation is relatively fast, and with a FileStream I would have to manage all the infrastructure for storing and retrieving the files, which in my opinion would be redundant and might also slow down the whole process.
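Since WCF can stream a response when the binding's transferMode is set to Streamed, one direction I have been considering is a read-only Stream that produces CSV rows on demand, so memory usage stays bounded by a single row no matter how large the file gets. This is only a minimal sketch; CsvRowStream is a name I made up for illustration, and it assumes the service is actually configured for streamed transfer:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

//A read-only Stream that materializes CSV rows one at a time as WCF
//pulls from it, instead of buffering the whole file up front.
public class CsvRowStream : Stream
{
    private readonly IEnumerator<string> _rows;
    private byte[] _current = new byte[0];
    private int _offset;

    public CsvRowStream(IEnumerable<string> rows)
    {
        _rows = rows.GetEnumerator();
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int written = 0;
        while (written < count)
        {
            if (_offset == _current.Length)
            {
                if (!_rows.MoveNext()) break; //no more rows: EOF
                _current = Encoding.UTF8.GetBytes(_rows.Current);
                _offset = 0;
            }
            int n = Math.Min(count - written, _current.Length - _offset);
            Array.Copy(_current, _offset, buffer, offset + written, n);
            _offset += n;
            written += n;
        }
        return written; //0 signals end of stream
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { throw new NotSupportedException(); } }
    public override long Position
    {
        get { throw new NotSupportedException(); }
        set { throw new NotSupportedException(); }
    }
    public override void Flush() { }
    public override long Seek(long o, SeekOrigin s) { throw new NotSupportedException(); }
    public override void SetLength(long v) { throw new NotSupportedException(); }
    public override void Write(byte[] b, int o, int c) { throw new NotSupportedException(); }

    protected override void Dispose(bool disposing)
    {
        if (disposing) _rows.Dispose(); //also closes the DB reader via the iterator
        base.Dispose(disposing);
    }
}
```

DownloadFile could then return something like new CsvRowStream over the rows produced by GetUsers, formatted the same way as in the snippet above; disposing the stream disposes the iterator, which in turn closes the SqlDataReader and connection.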