
I am trying to upload large files to an API from a .NET Core front end.

I have tried to use the code from Microsoft here, but they are using Angular and I want to do this from within a .NET Core HttpClient.

I have got it working, but the upload consumes the system's memory and doesn't release it. I expect files multiple GB in size.

My code looks like the following.

I have an HttpClient that looks something like this:

public async Task<UploadModel> UploadFileAsync(Stream fileStream, string param1, string param2)
{
    var uri = new Uri("upload", UriKind.Relative);

    try
    {               
        var ch = GetClaimsMessageProcessor();

        var fileStreamContent = new StreamContent(fileStream);

        using (var mfdContent = new MultipartFormDataContent("Upload----" + DateTime.Now.ToString()))
        {
            mfdContent.Add(fileStreamContent);
            var response = await _restClient.PostAsync(uri, mfdContent, callbacks: new Action<HttpRequestMessage>[]
            {
                ch.AddClaims,
                (x) =>
                {
                    x.Headers.Add("param1", param1);
                    x.Headers.Add("param2", param2);
                }
            });

            var content = await response.Content.ReadAsStringAsync();

            if (!response.IsSuccessStatusCode)
            {
                // log and throw exception etc
            }

            return JsonConvert.DeserializeObject<UploadModel>(content);
        }
    }
    catch (Exception e)
    {
        // log etc
        throw;
    }
}
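For comparison, here is a minimal sketch of streaming the request with a plain `HttpClient` (not the `_restClient` wrapper above, whose API I don't know) so the body is sent chunked instead of being buffered to compute `Content-Length`. The endpoint, part name, and file name here are illustrative:

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class StreamingUploadSketch
{
    public static async Task UploadAsync(HttpClient client, string path, Uri uri)
    {
        // Open the file for sequential, asynchronous reads.
        using var fileStream = new FileStream(
            path, FileMode.Open, FileAccess.Read, FileShare.Read,
            bufferSize: 81920, useAsync: true);

        using var content = new MultipartFormDataContent();
        // Name and filename populate the part's Content-Disposition header.
        content.Add(new StreamContent(fileStream), "file", Path.GetFileName(path));

        using var request = new HttpRequestMessage(HttpMethod.Post, uri)
        {
            Content = content
        };
        // Send the body with Transfer-Encoding: chunked so HttpClient does not
        // buffer the whole stream in memory to compute Content-Length.
        request.Headers.TransferEncodingChunked = true;

        // ResponseHeadersRead avoids buffering the response body as well.
        using var response = await client.SendAsync(
            request, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();
    }
}
```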

And my API looks something like this:

[HttpPost("upload")]
[Filters.DisableFormValueModelBinding]
public async Task<IActionResult> Upload()
{
    if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
    {
        return BadRequest($"Expected a multipart request, but got {Request.ContentType}");
    }

    // Get Header Parameters
    string param1, param2;

    try
    {
        param1 = Request.Headers["param1"][0];
        param2 = Request.Headers["param2"][0];
    }
    catch (Exception e)
    {
        // log and throw exception
        throw;
    }

    var filePath = GetFilePath();

    if (!Directory.Exists(filePath))
    {
        Directory.CreateDirectory(filePath);
    }

    var targetFilePath = Path.Combine(filePath, GetFileName());

    var boundary = MultipartRequestHelper.GetBoundary(
        MediaTypeHeaderValue.Parse(Request.ContentType),
        _defaultFormOptions.MultipartBoundaryLengthLimit);
    var reader = new MultipartReader(boundary, HttpContext.Request.Body);

    var section = await reader.ReadNextSectionAsync();
    while (section != null)
    {
        using (var targetStream = System.IO.File.Create(targetFilePath))
        {
            await section.Body.CopyToAsync(targetStream);
        }
        section = await reader.ReadNextSectionAsync();
    }

    // Validate 

    // Do stuff with uploaded file
    var model = new UploadedModel
    {
        //some properties
    };

    return Ok(model);
}
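For what it's worth, the Microsoft sample I referenced checks each section's `Content-Disposition` before writing, so file parts are streamed to disk and plain form-data parts are skipped. A sketch of that check, wrapped in a standalone method with illustrative names:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.WebUtilities;
using Microsoft.Net.Http.Headers;

public static class SectionFilterSketch
{
    // Streams only the sections that carry a file to targetFilePath,
    // skipping form-data sections that carry simple values.
    public static async Task SaveFileSectionsAsync(
        MultipartReader reader, string targetFilePath)
    {
        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            if (ContentDispositionHeaderValue.TryParse(
                    section.ContentDisposition, out var contentDisposition)
                && contentDisposition.DispositionType.Equals("form-data")
                && (!string.IsNullOrEmpty(contentDisposition.FileName.Value)
                    || !string.IsNullOrEmpty(contentDisposition.FileNameStar.Value)))
            {
                // This section carries a file; stream it straight to disk.
                using (var targetStream = File.Create(targetFilePath))
                {
                    await section.Body.CopyToAsync(targetStream);
                }
            }
            section = await reader.ReadNextSectionAsync();
        }
    }
}
```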

I have noticed that only one section is read. I would have assumed there would be multiple?
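My understanding is that `MultipartReader` yields one section per part added to the `MultipartFormDataContent`, and the client above adds only a single `StreamContent`, so one section may be expected. An illustration (the part names are made up):

```csharp
using System.IO;
using System.Net.Http;

public static class MultipartPartsSketch
{
    // Each Add() produces one part in the multipart body, and the server-side
    // MultipartReader yields one section per part.
    public static MultipartFormDataContent Build(Stream fileStream, string param1)
    {
        var mfdContent = new MultipartFormDataContent();
        mfdContent.Add(new StreamContent(fileStream), "file", "big.bin"); // -> section 1
        mfdContent.Add(new StringContent(param1), "param1");              // -> section 2
        return mfdContent;
    }
}
```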

  • I haven't had to do much around this in a while. However, I believe you may need to upload the file in chunks and reassemble them server-side. Something along the lines of - while(CURRENT_CHUNK < TOTAL_CHUNKS) { Upload_Chunk(CURRENT_CHUNK++, packet_size) } and on the server, capture the chunks and store them then assemble them when the last chunk is received. – Zakk Diaz Aug 14 '19 at 21:27
  • https://stackoverflow.com/questions/51422506/c-sharp-upload-file-by-chunks-bad-last-chunk-size – Zakk Diaz Aug 14 '19 at 21:28
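The chunking approach from the comment could be sketched like this; `uploadChunk` stands in for a hypothetical call that would POST each piece to the API for server-side reassembly:

```csharp
using System;
using System.IO;

public static class ChunkingSketch
{
    // Computes how many fixed-size chunks a file of totalBytes needs.
    public static long ChunkCount(long totalBytes, int chunkSize) =>
        (totalBytes + chunkSize - 1) / chunkSize;

    // Reads the file one chunk at a time. uploadChunk receives the buffer,
    // the number of valid bytes in it, the chunk index, and the total count.
    public static void UploadInChunks(string path, int chunkSize,
        Action<byte[], int, long, long> uploadChunk)
    {
        using var fs = File.OpenRead(path);
        long total = ChunkCount(fs.Length, chunkSize);
        var buffer = new byte[chunkSize];
        long index = 0;
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            uploadChunk(buffer, read, index++, total);
        }
    }
}
```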
