When I upload large files to my Web API in ASP.NET Core, the runtime loads the whole file into memory before my function for processing and storing the upload is invoked. With large uploads this becomes a problem: it is slow and it consumes a lot of memory. For previous versions of ASP.NET there are articles on how to disable request buffering, but I can't find any information on how to do this with ASP.NET Core. Is it possible to disable request buffering so I don't keep running out of memory on my server?
- I wrote my file upload backend to support uploading files in small chunks, according to the API for [flowjs](https://github.com/flowjs/flow.js) – jltrem Apr 05 '16 at 23:57
- Hello @jltrem, could you please share the ASP.NET Core controller and Angular code that handle files uploaded using flowjs? – Santosh Prasad Sah Sep 17 '16 at 08:42
3 Answers
Use Microsoft.AspNetCore.WebUtilities.MultipartReader, because it
can parse any stream with minimal buffering. It gives you the headers and body of each section one at a time, and then you do what you want with the body of that section (buffer, discard, write to disk, etc.).
Here is a middleware example.
app.Use(async (context, next) =>
{
    if (!IsMultipartContentType(context.Request.ContentType))
    {
        await next();
        return;
    }

    var boundary = GetBoundary(context.Request.ContentType);
    var reader = new MultipartReader(boundary, context.Request.Body);

    var section = await reader.ReadNextSectionAsync();
    while (section != null)
    {
        // process each section, streaming its body to disk in small chunks
        const int chunkSize = 1024;
        var buffer = new byte[chunkSize];
        var bytesRead = 0;
        var fileName = GetFileName(section.ContentDisposition);

        using (var stream = new FileStream(fileName, FileMode.Append))
        {
            do
            {
                bytesRead = await section.Body.ReadAsync(buffer, 0, buffer.Length);
                await stream.WriteAsync(buffer, 0, bytesRead);
            } while (bytesRead > 0);
        }

        section = await reader.ReadNextSectionAsync();
    }

    await context.Response.WriteAsync("Done.");
});
Here are the helpers.
private static bool IsMultipartContentType(string contentType)
{
    return
        !string.IsNullOrEmpty(contentType) &&
        contentType.IndexOf("multipart/", StringComparison.OrdinalIgnoreCase) >= 0;
}

private static string GetBoundary(string contentType)
{
    var elements = contentType.Split(' ');
    var element = elements.Where(entry => entry.StartsWith("boundary=")).First();
    var boundary = element.Substring("boundary=".Length);

    // Remove surrounding quotes, if present
    if (boundary.Length >= 2 && boundary[0] == '"' &&
        boundary[boundary.Length - 1] == '"')
    {
        boundary = boundary.Substring(1, boundary.Length - 2);
    }
    return boundary;
}

private static string GetFileName(string contentDisposition)
{
    return contentDisposition
        .Split(';')
        .SingleOrDefault(part => part.Contains("filename"))
        .Split('=')
        .Last()
        .Trim('"');
}
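The middleware approach above bypasses MVC entirely. If you instead stream inside an MVC action, the form value model binders will otherwise read (and buffer) the request body before your code runs. A resource filter along the lines of the one in Microsoft's large-file-upload guidance removes those value providers; this is a sketch, and the exact set of factory types available varies by ASP.NET Core version:

```csharp
// Sketch of a resource filter that stops MVC's form value providers from
// consuming (and buffering) the request body before the action runs.
// Adapted from Microsoft's large-file-upload guidance.
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
public class DisableFormValueModelBindingAttribute : Attribute, IResourceFilter
{
    public void OnResourceExecuting(ResourceExecutingContext context)
    {
        var factories = context.ValueProviderFactories;
        factories.RemoveType<FormValueProviderFactory>();
        factories.RemoveType<JQueryFormValueProviderFactory>();
    }

    public void OnResourceExecuted(ResourceExecutedContext context)
    {
    }
}
```

Apply it as `[DisableFormValueModelBinding]` on the upload action that reads `Request.Body` with `MultipartReader`.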
Shaun Luttin
- It appears that this code works perfectly on dnx451, but it has a memory leak on dnxcore50. Might be something that is being fixed for RC2, though. – Martin Apr 09 '16 at 12:54
- GetFileName() from the snippet above causes the server to buffer the whole file. I replaced it with a randomized file name, but after writing all chunks to the file I see that memory usage increased by an amount equal to the file size. It may actually be OK if a later GC gets rid of it, but it seems like it's not. – EvAlex May 13 '16 at 07:17
- @EvAlex `GetFileName` just parses a string. Is there something else that's happening before or after that could be causing the server to buffer the whole file? – Shaun Luttin May 13 '16 at 12:37
- @ShaunLuttin Thanks for sharing the code. Does this also work for files larger than 4 GB? – smedasn Oct 09 '17 at 23:30
- If the file name contains special characters such as the Euro sign or a double quote, `filename*` is used instead of `filename`, so GetFileName() won't necessarily work. See https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Disposition. – Aaron Queenan Apr 19 '18 at 10:49
- @ShaunLuttin I have a question: when calling reader.ReadNextSectionAsync(), it waits indefinitely, the code won't go forward, and there aren't any files in the stream. What's the solution? – Reza Akraminejad Oct 23 '18 at 09:27
Shaun Luttin's answer is great, and as of ASP.NET Core 2.2 much of the work he demonstrated is provided by the framework.
Get the boundary:
// Microsoft.AspNetCore.Http.Extensions.HttpRequestMultipartExtensions
var boundary = Request.GetMultipartBoundary();
if (string.IsNullOrWhiteSpace(boundary))
    return BadRequest();
You still get a section as follows:
var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();
Check the disposition and convert the section to a FileMultipartSection:
if (section.GetContentDispositionHeader() != null)
{
    var fileSection = section.AsFileSection();
    var fileName = fileSection.FileName;

    using (var stream = new FileStream(fileName, FileMode.Append))
        await fileSection.FileStream.CopyToAsync(stream);
}
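Put together, the body of a streaming upload action using these pieces might look like the following sketch (error handling omitted; the target directory is an assumption, and in real code you should never trust the client-supplied file name):

```csharp
var boundary = Request.GetMultipartBoundary();
if (string.IsNullOrWhiteSpace(boundary))
    return BadRequest();

var reader = new MultipartReader(boundary, Request.Body);
var section = await reader.ReadNextSectionAsync();

while (section != null)
{
    if (section.GetContentDispositionHeader() != null)
    {
        var fileSection = section.AsFileSection();

        // "uploads" is an illustrative target directory; sanitize or replace
        // the client-supplied name before using it on a real server.
        var targetPath = Path.Combine("uploads", Path.GetFileName(fileSection.FileName));

        using (var stream = new FileStream(targetPath, FileMode.Append))
            await fileSection.FileStream.CopyToAsync(stream);
    }

    section = await reader.ReadNextSectionAsync();
}

return Ok();
```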

Joshua Duxbury
Travis Troyer
- The editor marked an error on this line `if (section.GetContentDispositionHeader())`: _Cannot convert ContentDispositionHeaderValue to bool_ – Natthapol Vanasrivilai Mar 03 '19 at 22:35
-
In your controller you can simply use Request.Form.Files to access the files:
[HttpPost("upload")]
public async Task<IActionResult> UploadAsync(CancellationToken cancellationToken)
{
    if (!Request.HasFormContentType)
        return BadRequest();

    var form = Request.Form;

    foreach (var formFile in form.Files)
    {
        using (var readStream = formFile.OpenReadStream())
        {
            // Do something with the uploaded file
        }
    }

    return Ok();
}
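Note that Request.Form buffers uploads: small files in memory, larger ones to temporary files on disk. The thresholds can be adjusted via FormOptions in ConfigureServices; the values below are illustrative, not recommendations:

```csharp
// Illustrative FormOptions tuning, placed in Startup.ConfigureServices.
services.Configure<FormOptions>(options =>
{
    // Per-file bytes held in memory before spilling to a temp file
    // (the default is 64 KB).
    options.MemoryBufferThreshold = 64 * 1024;

    // Upper bound on the length of each multipart body
    // (the default is about 128 MB).
    options.MultipartBodyLengthLimit = 256L * 1024 * 1024;
});
```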

Lukazoid
- Using Request.Form.Files will cause the uploaded files to be saved to disk if they are bigger than a certain size. ISTR seeing 64 MB mentioned somewhere. – Aaron Queenan Apr 19 '18 at 10:46