
I have the following script, which works well locally (Windows 10 IIS, Windows 2003 Server) but not on our hosting server (Windows 2003 Server). Anything over 4 MB downloads really slowly and then times out before it reaches the end of the file. Locally, however, it downloads fast and complete.

Doing a direct download (a link to the file itself) pulls a 26.5 MB file in 5 seconds from our hosting provider's server, so there is no issue with a download limit. The issue seems to be between the hosting server and this script. Any ideas?

Response.AddHeader "content-disposition","filename=" & strfileName
Response.ContentType = "application/x-zip-compressed" 'here your content -type

Dim strFilePath, lSize, lBlocks
Const CHUNK = 2048
set objStream = CreateObject("ADODB.Stream")
objStream.Open
objStream.Type = 1
objStream.LoadFromfile Server.MapPath("up/"&strfileName&"") 
lSize = objStream.Size
Response.AddHeader "Content-Size", lSize
lBlocks = 1
Response.Buffer = False
Do Until objStream.EOS Or Not Response.IsClientConnected
Response.BinaryWrite(objStream.Read(CHUNK))
Loop

objStream.Close
Wayne Barron
  • Probably an IIS configuration issue. I believe the default setting is 4MB... – sgeddes Aug 24 '16 at 19:18
  • As I stated above, you can do a direct download on the file and it will download large files in no time at all; just not with the script. – Wayne Barron Aug 24 '16 at 21:19
  • Is that all the code in the page, or is that a snippet? Could something else be causing the slowdown, perhaps? Or what about increasing the chunk size? As an example, say it takes 100 milliseconds to read one chunk (2048 bytes); that means a 4 MB file would take approx. 3 minutes to download, whereas increasing the chunk size to, say, 20480 (20 KB) would only take approx. 20 seconds. See the difference? – user692942 Aug 25 '16 at 09:10

3 Answers


Just looking at the code snippet, it appears to be fine and is the very approach I would use for downloading large files (I especially like the use of Response.IsClientConnected).

Having said that, the problem is likely the size of the chunks being read in relation to the size of the file.

Very roughly, the formula is something like this:

time to read = (file size / chunk size) * read time per chunk

So if we use your example of a 4 MB file (4194304 bytes) and say it takes 100 milliseconds to read each chunk, then the following applies (a short sketch after the list works through the same arithmetic):

  • Chunk Size of 2048 bytes (2 KB) will take approx. 3 minutes to read.

  • Chunk Size of 20480 bytes (20 KB) will take approx. 20 seconds to read.
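
To make the arithmetic concrete, here is a minimal sketch of the estimate; the 100 millisecond per-chunk read time is just the illustrative assumption from above, not a measured figure:

Dim lFileSize, lChunkSize, lReadMs, dblSeconds
lFileSize = 4194304  '4 MB example file, in bytes
lChunkSize = 2048    '2 KB chunks
lReadMs = 100        'assumed read time per chunk, in milliseconds

'time to read = (file size / chunk size) * read time per chunk
dblSeconds = (lFileSize / lChunkSize) * lReadMs / 1000
Response.Write dblSeconds & " seconds" 'prints 204.8, i.e. a little over 3 minutes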

Classic ASP pages on IIS 7 and above have a default scriptTimeout of 00:01:30 (90 seconds), so in the example above a 4 MB file constantly read at 100 milliseconds per 2 KB chunk would time out before the script could finish.
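
If a longer-running download is unavoidable, the timeout can also be raised for the page from script. A minimal sketch (the 300-second value is an arbitrary example, not a recommendation):

'Override the default 90-second ASP script timeout for this page only.
Server.ScriptTimeout = 300 'value is in seconds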

Now, these are just rough statistics; your read time won't stay constant and is likely faster than 100 milliseconds (depending on disk read speeds), but I think you get the point.

So just try increasing the CHUNK constant:

Const CHUNK = 20480 'Read in chunks of 20 KB
user692942
  • You rock. Adding the 0 to the end did the job. I just tested it on the hosting server and it downloaded the 26 MB file completely, in about twice the time of the direct download: about 10 seconds for 26 MB versus about 5 seconds direct. Do you think adding another 0 (204800) would be OK to do, or not? – Wayne Barron Aug 25 '16 at 20:05
  • @WayneBarron it's a bit of trial and error; personally I'd step it up in increments, so having tried 2 KB and 20 KB, next I'd try 200 KB (204800 bytes). Just keep increasing until you're happy with the throughput. The larger the chunk being read, the more resource is needed, so it's a trade-off between resource usage and performance. – user692942 Aug 25 '16 at 20:24
  • I set it to 2048000. This is pulling at about 1.5 MB a second download speed, so that should handle it pretty well. Thanks Lankymart, have a good one. – Wayne Barron Aug 25 '16 at 21:44

The code I have is a bit different, using a For...Next loop instead of a Do...Until loop. I'm not 100% sure this will really help in your case, but it's worth a try. Here is my version of the code:

'Write the file in whole chunks, bailing out if the client disconnects.
'Integer division (\) avoids VBScript rounding the loop bound up past the file size.
For i = 1 To iSz \ chunkSize
    If Not Response.IsClientConnected Then Exit For
    Response.BinaryWrite objStream.Read(chunkSize)
Next
'Write whatever is left over after the last whole chunk.
If iSz Mod chunkSize > 0 Then
    If Response.IsClientConnected Then
        Response.BinaryWrite objStream.Read(iSz Mod chunkSize)
    End If
End If
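
For context, this snippet assumes the same kind of ADODB.Stream setup as the question's script; something along these lines, where the up/ folder path is carried over from the question and the 20 KB chunk size from the accepted answer as assumptions:

Dim objStream, iSz, chunkSize, i
chunkSize = 20480 '20 KB; tune as discussed in the accepted answer

Set objStream = Server.CreateObject("ADODB.Stream")
objStream.Open
objStream.Type = 1 'adTypeBinary
objStream.LoadFromFile Server.MapPath("up/" & strfileName)
iSz = objStream.Size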
Shadow The GPT Wizard
  • I've used a `Do While` for this type of chunked downloading in the past without issue. – user692942 Aug 25 '16 at 08:49
  • It's personal preference, but I find a `Do` loop is a cleaner approach than having an extra bit to read the remainder; it also avoids the code duplication on the `Response.BinaryWrite()`. – user692942 Aug 25 '16 at 11:04
  • In my case, it was the chunk size being read that was causing my issue, as pointed out in Lankymart's accepted answer above. – Wayne Barron Aug 25 '16 at 20:07

Basically, it's due to the script timeout. I had the same problem with 1 GB files after upgrading to Windows Server 2016 with IIS 10, where the default timeout is shorter.

I use chunks of 256000 bytes and Server.ScriptTimeout = 600 (10 minutes).
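
Expressed in script, those two settings from this answer would look something like this (a sketch; they would sit before the streaming loop):

Const CHUNK = 256000       '250 KB chunks
Server.ScriptTimeout = 600 '10 minutes, expressed in seconds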

user2033838