
I want to upload videos (up to 2 GB) from a Blazor WebAssembly app to my ASP.NET Core 5 server.

I am successfully using IFormFile for smaller files and that works fine.

I have studied a variety of sources. This post was very helpful in terms of the basics and making it work. This one is also very explanatory and gives a method of getting upload progress, which I want.

I have also used the samples from Microsoft that use streaming. I built the MVC app, made it work, and then tried it in the Blazor app, both using the JavaScript form (which works fine, but I can't see how to get progress) and using code-behind, which works for smaller files but runs out of memory when I try a huge 1.4 GB file.

The server controller is pretty straightforward and comes from the .NET samples:

    [HttpPost]
    [DisableFormValueModelBinding]
    [RequestSizeLimit(4294967296)]
    [RequestFormLimits(MultipartBodyLengthLimit = 4294967296)]

    //[ValidateAntiForgeryToken] (just for now..)
    [Route("UploadFile")]
    public async Task<IActionResult> UploadFile()
    {
        if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
        {
            ModelState.AddModelError("File", 
                $"The request couldn't be processed (Error 1).");
            // Log error

            return BadRequest(ModelState);
        }

        var boundary = MultipartRequestHelper.GetBoundary(
            MediaTypeHeaderValue.Parse(Request.ContentType),
            _defaultFormOptions.MultipartBoundaryLengthLimit);
        var reader = new MultipartReader(boundary, HttpContext.Request.Body);
        var section = await reader.ReadNextSectionAsync();

        while (section != null)
        {
            var hasContentDispositionHeader = 
                ContentDispositionHeaderValue.TryParse(
                    section.ContentDisposition, out var contentDisposition);

            if (hasContentDispositionHeader)
            {
  ....

            }

            // Drain any remaining section body that hasn't been consumed and
            // read the headers for the next section.
            section = await reader.ReadNextSectionAsync();
        }

        return Created(nameof(StreamingController), null);
    }
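
Inside the `hasContentDispositionHeader` branch (elided above), the sample streams each file section straight to its destination instead of buffering it, which is what keeps server memory flat. Roughly, assuming the sample's MultipartRequestHelper also provides HasFileContentDisposition, and with `_targetFilePath` being an illustrative placeholder rather than the sample's exact helper code:

    // Sketch only: the target path and the lack of content validation are illustrative,
    // not the full Microsoft sample (which also validates the uploaded file).
    if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
    {
        var trustedFileName = Path.GetRandomFileName();
        var targetPath = Path.Combine(_targetFilePath, trustedFileName);

        using (var targetStream = System.IO.File.Create(targetPath))
        {
            // Copy the section body straight to disk; only the copy buffer
            // is ever held in memory.
            await section.Body.CopyToAsync(targetStream);
        }
    }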

In my Blazor Test Page, I have a couple of trial inputs. This one:

                <h1>Upload Files</h1>

                <p>
                    <label>
                        Upload up to @maxAllowedFiles files:
                        <InputFile OnChange="@OnInputFileChange" multiple />
                    </label>
                </p>

                @if (files.Count > 0)
                {
              .....
                }

is from a Microsoft sample and uses this code:

    private async Task OnInputFileChange(InputFileChangeEventArgs e)
    {
        shouldRender = false;
        var upload = false;

        using var content = new MultipartFormDataContent();

        foreach (var file in e.GetMultipleFiles(maxAllowedFiles))
        {
            try
            {
                if (uploadResults.SingleOrDefault(
                    f => f.FileName == file.Name) is null)
                {
                    var fileContent = new StreamContent(file.OpenReadStream(maxFileSize), BufferSize);

                    files.Add(
                        new File()
                        {
                            Name = file.Name,
                        });

                    if (file.Size < maxFileSize)
                    {
                        content.Add(
                            content: fileContent,
                            name: "\"files\"",
                            fileName: file.Name);

                        upload = true;
                    }
                    else
                    {
                        //ILogger.LogInformation("{FileName} not uploaded", file.Name);

                        uploadResults.Add(
                            new UploadResult()
                            {
                                FileName = file.Name,
                                ErrorCode = 6,
                                Uploaded = false,
                            });
                    }
                }
            }
            catch(Exception ex)
            {
                Console.WriteLine($"Error: {ex.Message}");
            }
        }
        if (upload)
        {
            var response = await Http.PostAsync(Routes.UploadFileRoute, content);
            var newUploadResults = await response.Content
                .ReadFromJsonAsync<IList<UploadResult>>();
            uploadResults = uploadResults.Concat(newUploadResults).ToList();
        }
        shouldRender = true;
    }

I also have this form from the sample app:

    <h3>From SampleApp</h3>
    <form id="uploadForm" action="@UploadRoute" method="post"
          enctype="multipart/form-data" onsubmit="AJAXSubmit(this);return false;">
        <dl>
            <dt>
                <label for="file">File</label>
            </dt>
            <dd>
                <input id="file" type="file" name="file" />
            </dd>
        </dl>

        <input class="btn" type="submit" value="Upload" />

        <div style="margin-top:15px">
            <output form="uploadForm" name="result"></output>
        </div>
    </form>

and its associated JavaScript, loaded straight into the wwwroot index.html (note this is plain HTML/JavaScript - no Blazor JSInterop going on here):

    <script>
        "use strict";

        async function AJAXSubmit(oFormElement)
        {
            const formData = new FormData(oFormElement);
            try
            {
                const response = await fetch(oFormElement.action, {
                    method: 'POST',
                    headers:
                    {
                        'RequestVerificationToken': getCookie('RequestVerificationToken')
                    },
                    body: formData
                });

                oFormElement.elements.namedItem("result").value =
                    'Result: ' + response.status + ' ' + response.statusText;
            }
            catch (error)
            {
                console.error('Error:', error);
            }
        }

        function getCookie(name)
        {
            var value = "; " + document.cookie;
            var parts = value.split("; " + name + "=");
            if (parts.length == 2) return parts.pop().split(";").shift();
        }
    </script>

This is straight out of the Microsoft samples.

When I upload a 1.4 GB file using the Blazor InputFile approach (even though it is supposedly streamed), I can watch Task Manager and see Chrome's memory usage (and Visual Studio's) building up until the app dies with an out-of-memory error. Also, having set a breakpoint on the controller entry point, it is never hit; there is just a delay while some buffering goes on in the client (I guess?) and then boom. (I only have 8 GB of RAM in my laptop, which runs out. It might work if I had more, but that is not the issue; the issue is that the file is being buffered up in the browser at all.)

If I do the same thing using the form and JS, the breakpoint is hit straight away, and when I continue, Chrome's memory usage increases only slightly before levelling off. Visual Studio's memory still grows, but not too much, and I can watch the buffering work.

Is this me misunderstanding how StreamContent actually works, or is it a Blazor-ism? I think I saw somewhere that the fetch went through some JS interop which prevented it from streaming properly, but that was in an old post from before .NET Core 3.1, and I am using 5 now, so I would hope that has been sorted.
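
If newer Blazor WebAssembly releases expose an opt-in for streaming the fetch request body (`SetBrowserRequestStreamingEnabled`), something like the sketch below might avoid the client-side buffering; I have not confirmed whether it is available on .NET 5, and `UploadStreamedAsync` is just a name of my own, reusing `Http`, `maxFileSize` and `Routes.UploadFileRoute` from the code above:

    // Sketch only: SetBrowserRequestStreamingEnabled ships with newer Blazor WebAssembly
    // releases - check it exists on your framework version before relying on it.
    // Needs: @using Microsoft.AspNetCore.Components.WebAssembly.Http
    private async Task<HttpResponseMessage> UploadStreamedAsync(IBrowserFile file)
    {
        using var content = new MultipartFormDataContent();
        content.Add(new StreamContent(file.OpenReadStream(maxFileSize)), "\"files\"", file.Name);

        var request = new HttpRequestMessage(HttpMethod.Post, Routes.UploadFileRoute)
        {
            Content = content
        };

        // Ask the browser's fetch to stream the request body instead of buffering it first.
        request.SetBrowserRequestStreamingEnabled(true);

        return await Http.SendAsync(request);
    }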

I have also tried Blazorise because it seems to have some events that would allow progress monitoring, but I can't work out how to use them, since they seem to fire while a buffer is built up in the client and are finished before anything hits the server - which sort of backs up the theory above.

Thanks in advance....

Brett JB

1 Answer


I have been meaning to answer this for a while, but now that I have had a specific request, here it is...

I solved this as follows. I am using Azure Storage for my data, so you will need to vary the strategy for your own solution. It also deals with providing progress bars and handling multiple files. The Azure Storage library has a set of block blob functions that allow for this very scenario; you might want to check them out if you want to roll your own against your own storage.

The client-side 'FileService' function that deals with this is here:

    public async Task<List<ExternalFileDTO>> HandleFilesUpload(FileChangedEventArgs e, IProgress<Tuple<int, int, string>> progressHandler,
        IProgress<Tuple<int, string>> fileCountHandler, ExternalFileDTO fileTemplate, CancellationToken cancellationToken = default)
    {

        int FileCount =0;
        fileTemplate.CreatedBy = _userservice.CurrentUser;
        Tuple<int, string> reportfile;
        Tuple<int, int, string> CountProgressName;
        List<ExternalFileDTO> NewFilesList = new List<ExternalFileDTO>();
        foreach (var file in e.Files)
        {
            FileCount++;
            reportfile = Tuple.Create(FileCount, file.Name);
            ExternalFileDTO currentfile = new ExternalFileDTO(); // Set up a current file in case of cancel

            fileCountHandler.Report(reportfile);
            try
            {

                // A bit clumsy, but in the event of an error, NewFilesList gets a single ExternalFileDTO added with a status
                // and error message. This is because the caller gets a list of the files added (or not) with their status;
                // if it fails partway through, it will be the list up to that point plus the error.
                if (file == null)
                {
                    fileTemplate.Status= new InfoBool(false, "File is null");
                    NewFilesList.Add(new ExternalFileDTO(fileTemplate));
                    return NewFilesList;

                }
                long filesize = file.Size;
                if (filesize > maxFileSize)
                {
                    fileTemplate.Status = new InfoBool(false, "File exceeds Max Size");
                    NewFilesList.Add(new ExternalFileDTO(fileTemplate));
                    return NewFilesList;

                }
                fileTemplate.OriginalFileName = file.Name;
                fileTemplate.FileType = file.Type;

                var sendfile = await _azureservice.GetAzureUploadURLFile(fileTemplate);
                if (!sendfile.Status.Success) // There was an error so return the details
                {

                    NewFilesList.Add(sendfile);
                    return NewFilesList;
                }
                currentfile = sendfile; // This allows the current file to be passed out of the loop if cancelled

                BlockBlobClient blockBlobclient = new BlockBlobClient(sendfile.CloudURI);

                // BlobHttpHeaders blobheader = new BlobHttpHeaders { ContentType = fileTemplate.FileType };

                // blockBlobclient.SetHttpHeaders(blobheader);

                byte[] buffer = new byte[BufferSize];
                using (var bufferedStream = new BufferedStream(file.OpenReadStream(maxFileSize), BufferSize))
                {
                    int readCount = 0;
                    int bytesRead;
                    long TotalBytesSent = 0;
                    // track the current block number as the code iterates through the file
                    int blockNumber = 0;

                    // Create list to track blockIds, it will be needed after the loop
                    List<string> blockList = new List<string>();

                    while ((bytesRead = await bufferedStream.ReadAsync(buffer, 0, BufferSize)) > 0)
                    {
                        blockNumber++;
                        // set block ID as a string and convert it to Base64 which is the required format
                        string blockId = $"{blockNumber:0000000}";
                        string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId));

                        Console.WriteLine($"Read:{readCount++} {bytesRead / (double)BufferSize} MB");

                        // Do work on the block of data
                        await blockBlobclient.StageBlockAsync(base64BlockId, new MemoryStream(buffer, 0, bytesRead), null, null, null, cancellationToken);
                        // add the current blockId into our list
                        blockList.Add(base64BlockId);
                        TotalBytesSent += bytesRead;
                        int PercentageSent = (int)(TotalBytesSent * 100 / filesize);
                        CountProgressName = Tuple.Create(FileCount, PercentageSent, file.Name);
                        //
                        // I'm a bit confused as I don't know how the OperationCancelledException is thrown
                        // But it is!
                        // 
                        progressHandler.Report(CountProgressName);
                    }

                    await blockBlobclient.CommitBlockListAsync(blockList, null, cancellationToken);

                    // make sure to dispose the stream once you are done
                    bufferedStream.Dispose();   // Belt and braces
                }
                //
                // Now make a server API call to verify the file upload was successful
                //
                sendfile.Status = new InfoBool(false, "File upload succeeded");
                currentfile = await _azureservice.VerifyFileAsync(sendfile);
                if (currentfile.Status.Success)
                {
                    // Only add the file into the current live files list if the result was successful
                    CurrentFiles.Add(currentfile); // Add the returned sendfile object to the list
                }
                NewFilesList.Add(currentfile); // Add to this anyway since the result is displayed to the user

            }
            catch (OperationCanceledException cancelEx)
            {
                Console.WriteLine(cancelEx.Message);
                currentfile.Status = new InfoBool(true, cancelEx.Message); // Set the file status as cancelled
                await _azureservice.VerifyFileAsync(currentfile);
            }
            catch (Exception exc)
            {
                Console.WriteLine(exc.Message);
            }
            finally
            {
               
            }
        }
        return NewFilesList; // Return the list of files uploaded or not.. 

    }
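
For context, `GetAzureUploadURLFile` above asks the server API for a DTO whose `CloudURI` the client can write to directly (presumably a SAS URL, since the `BlockBlobClient` is built from the URI alone), so the browser uploads straight to Azure rather than through the server. How you build that URL is up to you; here is a minimal sketch assuming `Azure.Storage.Blobs`/`Azure.Storage.Sas`, a shared-key-authenticated container client, and illustrative names rather than my actual service code:

    // Sketch only: container/blob naming and the one-hour expiry are assumptions.
    // using Azure.Storage.Blobs;
    // using Azure.Storage.Sas;
    public Uri CreateUploadSasUri(BlobContainerClient container, string blobName)
    {
        var blobClient = container.GetBlobClient(blobName);

        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = container.Name,
            BlobName = blobName,
            Resource = "b", // "b" = an individual blob
            ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
        };
        sasBuilder.SetPermissions(BlobSasPermissions.Create | BlobSasPermissions.Write);

        // Requires the container client to be authorised with a StorageSharedKeyCredential.
        return blobClient.GenerateSasUri(sasBuilder);
    }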

This bit of UI allows the user to create a list of files for upload:

            @if (IsNewItem)
            {
                @if (fileArgs != null && fileArgs.Files.Count() > 0)
                {
                    <div>Selected Files:</div>
                    <ul>
                        @foreach (var file in @fileArgs.Files)
                        {
                            <li>@file.Name</li>
                        }
                    </ul>
                    if (!IsUploading)
                    {
                        <Button Color="Color.Primary" Clicked="@UploadFiles">Upload Files</Button>
                    }
                }
                @if (IsUploading)
                {
                    <div>
                        <div> Uploading @FileNameUploading</div>
                        <div> @FileCount of @TotalFilesUploading files - File Progress % = @ProgressPercent %</div>
                    </div>

                    <div>
                        <Progress>
                            <ProgressBar Value="@ProgressPercent" Color="Color.Success" />
                        </Progress>
                    </div>

                    <Button Color="Color.Danger" Clicked="@CancelUpload">Cancel</Button>
                }
                else
                {
                    <Span>Uploading is FALSE</Span>
                }
                <div >
                    <Blazorise.FileEdit @ref="fileedit" Changed="@BLOnFilesChanged"
                        Filter=@FileService.AllowedFileTypes Multiple="true" />
                </div>

            }

I used the Blazorise 'FileEdit' more for the UI consistency than its functionality in the end.

The 'UploadFiles' function in the code section looks like this:

    async Task UploadFiles()
    {

            TotalFilesUploading = fileArgs.Files.Count();
            IsUploading = true;
            // throw the Fileupload selection to the File Service
            List<ExternalFileDTO> sentfiles = await FileService.HandleFilesUpload(fileArgs, progressHandler, fileCountHandler, editUserFile, cts.Token);

            IsUploading = false;

            StringBuilder bob = new StringBuilder();
            bob.Append("File Upload Status:<br />");
            foreach (ExternalFileDTO file in sentfiles)
            {
                bob.Append($"FIle:{file.OriginalFileName} - {file.Status.Reason}</div><br />");
            }
            if (sentfiles.Count > 0)
            {
                ShowStatusMsg(bob.ToString());
            }
            TotalFilesUploading = 0;    //Clear down the vars
            HideModal();    //File has finished uploading close and refresh
            await OnUserFilesUploaded.InvokeAsync();
            this.StateHasChanged();     // Refresh the display

    }

... and finally, the progress callbacks are quite simple in the end. Here is some housekeeping code that the page (form in my case) uses to set things up.

    protected override void OnInitialized()
    {
        progressHandler = new Progress<Tuple<int, int, string>>(UploadProgressChanged);
        fileCountHandler = new Progress<Tuple<int,string>>(FileCountChanged);
        FileService.CurrentFilesChanged += Refresh; // Refresh the page on the change of files
    }

    private void FileCountChanged(Tuple<int,string> FileNoAndNameSending)
    {
        Console.WriteLine($"FileCount Changed  = {FileNoAndNameSending}");
        FileCount = FileNoAndNameSending.Item1;
        FileNameUploading = FileNoAndNameSending.Item2;
        this.StateHasChanged();
    }

    private void UploadProgressChanged(Tuple<int, int, string> CountProgressName)
    {
        Console.WriteLine($"File Name: {CountProgressName.Item3} /n Fileno: {CountProgressName.Item1}, Upload Progress Changed Percentage = {CountProgressName.Item2}");
        FileCount = CountProgressName.Item1;
        ProgressPercent = CountProgressName.Item2;
        FileNameUploading = CountProgressName.Item3;
        if (FileCount >= TotalFilesUploading && ProgressPercent >=100)
        {
            // This is the last file and it is complete
            Console.WriteLine($"Last File reached at 100%");
        }
        Refresh(); // Update the display
    }

I am sure this is not ideal, but it does work and seems pretty robust. One thing I did have to take care with was making sure the verify-file service did a good job. I still have a nagging doubt about whether the catch happens properly in the event of Azure cancelling with some kind of error, but that is for a later time.
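
For reference, the server-side check behind `VerifyFileAsync` can be as simple as confirming the blob now exists (and optionally checking its size) before the file record is accepted. A minimal sketch, assuming `Azure.Storage.Blobs` and names that are illustrative rather than my actual service:

    // Sketch only: connection string, container name and blob name handling are assumptions.
    // using Azure.Storage.Blobs;
    public async Task<bool> VerifyBlobExistsAsync(string connectionString, string containerName, string blobName)
    {
        var container = new BlobContainerClient(connectionString, containerName);
        var blob = container.GetBlobClient(blobName);

        // The blob only exists once the block list has been committed successfully.
        var exists = await blob.ExistsAsync();
        if (!exists.Value)
            return false;

        var props = await blob.GetPropertiesAsync();
        return props.Value.ContentLength > 0; // optionally compare against the expected size
    }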

Hope this helps.

Brett JB