
I have a SQL Server table with a Varbinary(Max) column that is basically compressed data.

My page allows users to download this data (after usual user authentication).

It used to work fine with smaller data sizes, but the data has grown over time, and now there is a long wait before the Save dialog appears.

Code:

while (reader.Read())
{
    Response.Buffer = false;
    Response.Charset = "";
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.ContentType = "application/gzip";
    Response.AddHeader("content-disposition", "attachment;filename=" + "vbet_1_1.sdf.gz");
    byte[] bytes = (Byte[])reader["backupdata"]; // STUCK HERE
    Response.BinaryWrite(bytes);
    Response.Flush();
    Response.End();
}

In the debugger, I can see that

byte[] bytes = (Byte[])reader["backupdata"];

is the line where the lag occurs.

My platform is ASP.NET with .NET Framework 4.0, SQL Server 2008, and C# code-behind.

Kiquenet
arvind
  • I would suggest moving that data to the file system; is there any reason you store the data in the DB? Please see this for more details: http://stackoverflow.com/questions/3748/storing-images-in-db-yea-or-nay – Arsen Mkrtchyan Oct 22 '13 at 11:31
  • I really wish for that and have even suggested it, but things here take time to change. – arvind Oct 22 '13 at 11:33

2 Answers


You need to stream the response back. Reading the entire file into memory and then writing it out is not going to scale; you'll start exhausting the server as the number of requests or the size of the files increases.

Have a look at Download and Upload images from SQL Server via ASP.Net MVC and FILESTREAM MVC: Download and Upload images from SQL Server for examples of how to do this. Since you are not using MVC but plain ASP.NET you can do it more simply, but the ideas are the same:
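A minimal sketch of that idea for Web Forms code-behind follows. The backupdata column name comes from the question; the Backups table, BackupId key, and the connectionString/backupId variables are assumptions for illustration. The reader is opened with CommandBehavior.SequentialAccess so ADO.NET does not buffer the whole BLOB, and the column is copied to Response.OutputStream in chunks with GetBytes:

// Requires: using System.Data; using System.Data.SqlClient;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT backupdata FROM Backups WHERE BackupId = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", backupId);
    conn.Open();

    // SequentialAccess makes the reader stream the column instead of
    // loading the whole varbinary(max) value into memory at once.
    using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            Response.Buffer = false;
            Response.ContentType = "application/gzip";
            Response.AddHeader("Content-Disposition", "attachment; filename=vbet_1_1.sdf.gz");

            byte[] buffer = new byte[64 * 1024]; // 64 KB chunks
            long offset = 0;
            long read;
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                Response.OutputStream.Write(buffer, 0, (int)read);
                Response.Flush();
                offset += read;
            }
        }
    }
}
Response.End();

With this approach the Save dialog appears as soon as the first chunk arrives, instead of only after the whole value has been materialized as a byte[].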

Remus Rusanu
  • Yes, I am reading and collecting all the information, thanks. Is there a code snippet available? (The first example looks like a library implementation and will take time to study.) – arvind Oct 22 '13 at 11:48

This may be helpful:

byte[] data = (byte[])reader["backupdata"];
Response.AppendHeader("Content-Type", "application/gzip");
Response.OutputStream.Write(data, 0, data.Length);

You can also use FILESTREAM/FileTable storage (FileTables were added in SQL Server 2012) to keep the files directly on the file system.
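A rough sketch of that route, assuming the column were converted to a FILESTREAM column (the Backups table, BackupId key, and connectionString/backupId variables are again assumptions), would stream the data with SqlFileStream:

// Requires: using System.Data.SqlClient; using System.Data.SqlTypes; using System.IO;
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tx = conn.BeginTransaction()) // FILESTREAM access requires a transaction
    {
        var cmd = new SqlCommand(
            "SELECT backupdata.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
            "FROM Backups WHERE BackupId = @id", conn, tx);
        cmd.Parameters.AddWithValue("@id", backupId);

        string path = null;
        byte[] txContext = null;
        using (var reader = cmd.ExecuteReader())
        {
            if (reader.Read())
            {
                path = reader.GetString(0);    // NTFS path of the BLOB
                txContext = (byte[])reader[1]; // token that SqlFileStream needs
            }
        }

        if (path != null)
        {
            // Opens the BLOB as a file and streams it straight to the client.
            using (var source = new SqlFileStream(path, txContext, FileAccess.Read))
            {
                source.CopyTo(Response.OutputStream);
            }
        }

        tx.Commit();
    }
}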

Georgy Grigoryev