
I am trying to upload files; small ones work, but large ones seem to fail after 2:16. The max request length I have set I expect to take about 20 minutes to upload, and 3600 seconds is an hour. Have I set the wrong attribute?

<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.web>
    <authorization>
        <deny users="?" />
    </authorization>
  <httpRuntime executionTimeout="3600" maxRequestLength="2097151" requestValidationMode="2.0"/>
</system.web>
</configuration>
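One thing worth checking (not in the original question): if the site runs on IIS 7 or later, IIS enforces its own request-size cap, `maxAllowedContentLength`, independently of `maxRequestLength`, and it defaults to roughly 30 MB. A sketch of raising it, with an illustrative value chosen to match the 2097151 KB above:

```xml
<!-- IIS 7+ limits request size separately from ASP.NET's maxRequestLength.
     maxAllowedContentLength is in bytes; 2147482624 bytes ~ 2097151 KB.
     The value here is illustrative, not prescribed by the question. -->
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147482624" />
    </requestFiltering>
  </security>
</system.webServer>
```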
Brian Mains
StephanM
  • Any idea why I would be getting this then? "This problem can be caused by a variety of issues, including: • Internet connectivity has been lost. • The website is temporarily unavailable. • The Domain Name Server (DNS) is not reachable. • The Domain Name Server (DNS) does not have a listing for the website's domain. • There might be a typing error in the address. • If this is an HTTPS (secure) address, click Tools, click Internet Options, click Advanced, and check to be sure the SSL and TLS protocols are enabled under the security section." – StephanM Nov 10 '11 at 20:33
  • I am uploading a large file using the upload control. – StephanM Nov 10 '11 at 20:35
  • The file upload control does not stream; it will attempt to hold the file in memory, which will cause an OOM condition or a worker process recycle. See my answer for a longer discussion. – MatthewMartin Nov 10 '11 at 20:40
  • There are a few responses to similar issues: http://stackoverflow.com/questions/1021243/file-upload-issue – Chad Kapatch Nov 10 '11 at 20:41

1 Answer


The usual problem is that the code that receives the uploaded file puts the result into a byte array (byte[]).

Those byte arrays are held entirely in memory. Depending on your OS version, the web server, the amount of memory, etc., IIS will recycle the worker process, usually somewhere around 800 MB of memory usage. This is done so that a single request using excessive amounts of memory doesn't take down the entire server.

Third-party file uploaders use a variety of techniques to stream files a chunk at a time, and can upload files of multiple GBs without memory usage ever rising above a handful of kilobytes.

The streaming technique also has to be maintained in every layer of code that touches the file -- i.e. if a component writes it to disk, it must stream and chunk rather than accumulate the whole thing in a byte[] first and write it out afterwards. Ditto when the code ultimately writes the file to a BLOB column in the DB.
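The chunk-at-a-time idea itself is language-neutral. As a minimal sketch (in Java, purely for illustration; the answer does not prescribe any particular component), copying a stream in fixed-size chunks keeps memory bounded by the buffer size rather than the file size:

```java
import java.io.*;

public class ChunkedCopy {
    // Copy input to output in fixed-size chunks.
    // Memory use stays bounded by the 64 KB buffer, never the file size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[64 * 1024];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) > 0) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a 1,000,000-byte upload from an in-memory source.
        byte[] data = new byte[1_000_000];
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), out);
        System.out.println(copied); // prints 1000000
    }
}
```

The same loop shape applies whether the destination is a file, a network stream, or a BLOB write; the key point is that no layer ever holds the whole payload at once.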

MatthewMartin