
I have a problem uploading large files to Azure. I am working on an ASP.NET Core 5.0 API project. I implemented the upload functionality following Microsoft's recommendation, and I also added a polling mechanism so the frontend application has a separate endpoint to check the upload status. Everything works fine when I run it locally, but large files fail on Azure. My API runs on an Azure App Service Premium P1v3 plan, and it returns a 502 Bad Gateway for large files (above 1 GB).

I ran some tests, and 98% of the request time is spent reading the stream. This is the code from the Microsoft docs:

    if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
    {
        untrustedFileNameForStorage = contentDisposition.FileName.Value;
        // Don't trust the file name sent by the client. To display
        // the file name, HTML-encode the value.
        trustedFileNameForDisplay = WebUtility.HtmlEncode(
            contentDisposition.FileName.Value);

        streamedFileContent =
            await FileHelpers.ProcessStreamedFile(section, contentDisposition,
                ModelState, _permittedExtensions, _fileSizeLimit);

        if (!ModelState.IsValid)
        {
            return BadRequest(ModelState);
        }
    }

I know there is a 230-second load balancer timeout on Azure App Service, but when I test with Postman, in most cases the 502 is returned after about 30 seconds.

Maybe I need to set some configuration option on Azure App Service? Always On is enabled.

I would like to stay with Azure App Service, but I was also considering migrating to Azure Functions, or letting the frontend application upload files directly to Azure Blob Storage.

Do you have any idea how to solve it?

mskuratowski
  • Hello, I am facing a very similar issue but using NodeJs, were you able to solve this? I have been struggling with this for days – Farid Hajnal Jun 25 '21 at 15:34

2 Answers


Update:

Uploading and Downloading large files in ASP.NET Core 3.1?

The previous answers assume everything stays on App Service, but storing large files on App Service is not recommended: first, future updates and deployments will become slower and slower, and second, the disk space will quickly be used up.

So Azure Storage is recommended instead. If you use Azure Storage, Suggestion 2 below is the way to go for larger files: upload them in chunks.
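To illustrate the direct-to-storage idea (this sketch is not from the original answer): the API can issue a short-lived SAS URL and the frontend then uploads straight to Blob Storage, so the large request body never passes through the App Service load balancer. A minimal sketch using the `Azure.Storage.Blobs` SDK; the connection string, container, and blob names are placeholders:

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Sketch: issue a write-only SAS URI so the browser can PUT the file
// directly to Blob Storage. All names here are placeholders.
public static class SasIssuer
{
    public static Uri GetUploadUri(string connectionString, string containerName, string blobName)
    {
        var blobClient = new BlobContainerClient(connectionString, containerName)
            .GetBlobClient(blobName);

        var sasBuilder = new BlobSasBuilder
        {
            BlobContainerName = containerName,
            BlobName = blobName,
            Resource = "b", // "b" = individual blob
            ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(30)
        };
        // Allow creating/writing the blob, nothing else.
        sasBuilder.SetPermissions(BlobSasPermissions.Create | BlobSasPermissions.Write);

        // Works when the client was constructed with a shared key credential
        // (e.g. from a connection string).
        return blobClient.GenerateSasUri(sasBuilder);
    }
}
```

The API returns this URI to the frontend, which uploads with a plain HTTP `PUT` (header `x-ms-blob-type: BlockBlob`), keeping only a small JSON exchange on the App Service itself.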

Previous answer:

Please confirm whether the large file actually transfers successfully even when a 500 error is returned.

I have looked into this behavior before. It varies by browser, and the 500 error appears roughly between 230 s and 300 s. But looking through the logs, the program keeps running.

Related Post:

The request timed out. The web server failed to respond within the specified time

So I have two suggestions you can refer to:

Suggestion 1:

It is recommended to add an HTTP endpoint (say, getStatus) to your program that reports file upload progress, like a progress bar. Have the upload endpoint return 202 Accepted as soon as the transfer starts, then let the client poll getStatus; when progress reaches 100%, report success.
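A minimal sketch of this polling pattern, under the assumption of an ASP.NET Core controller; the route names and the `UploadsController` type are illustrative, not from the original answer:

```csharp
using System;
using System.Collections.Concurrent;
using Microsoft.AspNetCore.Mvc;

// Sketch: the upload endpoint returns 202 Accepted immediately and the
// client polls /getStatus/{uploadId} until progress reaches 100%.
[ApiController]
public class UploadsController : ControllerBase
{
    // uploadId -> percent complete (0-100); in-memory for illustration only.
    private static readonly ConcurrentDictionary<string, int> Progress = new();

    [HttpPost("upload")]
    public IActionResult Upload()
    {
        var uploadId = Guid.NewGuid().ToString("N");
        Progress[uploadId] = 0;
        // ... start consuming the multipart stream in the background,
        // updating Progress[uploadId] as bytes are read ...
        return Accepted(new { uploadId }); // 202 Accepted
    }

    [HttpGet("getStatus/{uploadId}")]
    public IActionResult GetStatus(string uploadId) =>
        Progress.TryGetValue(uploadId, out var pct)
            ? Ok(new { uploadId, progress = pct, done = pct >= 100 })
            : NotFound();
}
```

Note this decouples the client's HTTP request from the long-running work, but the server must still receive the body somewhere, so it does not by itself remove the gateway timeout on the upload request.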

Suggestion 2:

Use MultipartRequestHelper to cut/slice the large file into chunks. Your usage may be wrong; please refer to the post below.

Dealing with large file uploads on ASP.NET Core 1.0

The .NET Core version there is older, but the idea is the same.
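As a sketch of the chunked-upload-to-blob idea (assuming the `Azure.Storage.Blobs` v12 SDK; the helper name and 4 MB block size are my choices, not from the original answer), the server side can stage each chunk as a block on a block blob and commit the block list at the end:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs.Specialized;

// Sketch: stage a stream as 4 MB blocks on a block blob, then commit the
// block list so the blob is assembled server-side.
public static class ChunkedUploader
{
    public static async Task UploadInBlocksAsync(BlockBlobClient blob, Stream source)
    {
        const int blockSize = 4 * 1024 * 1024; // 4 MB per block
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];
        int read, index = 0;

        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // Block IDs must be base64 strings of equal length.
            var blockId = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(index++.ToString("d6")));
            using var block = new MemoryStream(buffer, 0, read);
            await blob.StageBlockAsync(blockId, block);
            blockIds.Add(blockId);
        }

        // Committing the list turns the staged blocks into the final blob.
        await blob.CommitBlockListAsync(blockIds);
    }
}
```

Because each chunk is its own short HTTP request, no single request comes anywhere near the 230-second load balancer timeout.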

Jason Pan
  • I have implemented the Suggestion 1 approach and, as I said, it works locally, but on Azure I get an HTTP 502 after 30 seconds. – mskuratowski Mar 30 '21 at 06:27
  • @mskuratowski You must have created resources in the Azure portal, right? What I mean is to follow the Azure portal's pattern: don't wait for the upload to finish, just return a message such as "upload started". – Jason Pan Mar 30 '21 at 06:31
  • @mskuratowski https://learn.microsoft.com/en-us/rest/api/appservice/appserviceplans/createorupdate#response [Like 202 accepted](https://i.stack.imgur.com/imtAw.png). – Jason Pan Mar 30 '21 at 06:33
  • @mskuratowski App Service is not recommended for storing files; upload large files to blobs in chunks instead. – Jason Pan Mar 30 '21 at 06:39
  • @mskuratowski https://stackoverflow.com/questions/62502286/uploading-and-downloading-large-files-in-asp-net-core-3-1 – Jason Pan Mar 30 '21 at 06:40
  • I know that I can use Azure Blob storage directly. How about Azure function? – mskuratowski Mar 30 '21 at 07:08
  • As I said, in Suggestion 1 reading the file (before any upload) takes 98% of the request time. – mskuratowski Mar 30 '21 at 07:09
  • @mskuratowski Using an Azure Function also means creating an HttpTrigger, and the HTTP request is still sent the same way, so it is still not feasible. – Jason Pan Mar 30 '21 at 07:10
  • @mskuratowski https://www.skyfinder.cc/2020/12/31/asp-net-core-big-file-uploader/ You can take a look; this is the best way to upload large files. – Jason Pan Mar 30 '21 at 07:11

I am facing a similar issue uploading documents of larger size (up to 100 MB) through an ASP.NET Core API hosted behind Azure Application Gateway. I have set the gateway timeout to 10 minutes and applied these attributes to the action:

    [RequestFormLimits(MultipartBodyLengthLimit = 209715200)]
    [RequestSizeLimit(209715200)]

Kestrel is also configured to accept 200 MB:

    .UseKestrel(options =>
    {
        options.Limits.MaxRequestBodySize = 209715200;
        options.Limits.KeepAliveTimeout = TimeSpan.FromMinutes(10);
    });

The file content is sent as base64 in the request object. I would appreciate any help with this problem.