
I need to upload and read a large file (more than 500 MB). I have a column of type varbinary(MAX) in SQL Server to store it.

I want to upload and read the file using C#. I need to pass a byte array to store the file in the SQL Server database, but I get an error while initializing the byte array when uploading a large file.

I can upload a file of 250 MB, but when I try to upload a file of more than 500 MB, it throws an "Out of Memory" exception while initializing the byte array:

 byte[] fileByte = new byte[file.ContentLength]; // "Out of Memory" exception
 file.InputStream.Read(fileByte, 0, fileByte.Length);

I also tried:

 BinaryReader br = new BinaryReader(inputStream);
 byte[] fileByte = br.ReadBytes(file.ContentLength); // "Out of Memory" exception

I don't want to upload and download the file in chunks, for file safety. Please let me know if anyone can help me upload a large file, or suggest a way to fix the "Out of Memory" exception.

Thanks.

  • Set `Response.BufferOutput = false;` to prevent buffering. Possible dupes: https://stackoverflow.com/questions/2809514/outofmemoryexception-when-i-read-500mb-filestream & https://stackoverflow.com/questions/3362849/outofmemoryexception-when-send-big-file-500mb-using-filestream-aspnet (a streaming sketch follows these comments). – Tetsuya Yamamoto Aug 30 '17 at 04:38
  • I wouldn't recommend saving files in a database table. Relational databases can store binary data, but it won't be efficient. Consider saving the files to FILESTREAM/FileTable if your version supports it. Otherwise, storing them on a regular file server with the path stored in the table may also be enough. – Nayak Aug 30 '17 at 05:40
  • I have used FILESTREAM/FileTable with SQL Server 2012, but I had an issue reading the file from the FileTable. I need to convert the file stream to a byte array while reading the file into C# from the database, and that is when I get the "Out of Memory" exception. – Dharmesh Parekh Aug 30 '17 at 06:09
  • 32-bit application? 500 MB sounds awfully small to get an out of memory. Is there a reason you do not use SQL Server FILESTREAM (note that this is NOT the same as a C# FileStream: https://learn.microsoft.com/en-us/sql/relational-databases/blob/filestream-sql-server?view=sql-server-ver15)? It basically stores the binary column transparently in the file system. It also has SQL functions allowing you to get the file name so you can read the files directly. – TomTom Jun 18 '20 at 22:45
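
For reference, a minimal sketch of the streaming approach the links above point at: since .NET 4.5, SqlClient accepts a `Stream` as the value of a `varbinary(MAX)` parameter, so the upload is streamed to the server and no `byte[]` of the full file size is ever allocated. The table name `Files`, its columns, and the MVC `HttpPostedFileBase` type are assumptions, not taken from the question.

 using System.Data;
 using System.Data.SqlClient;
 using System.Web;

 public static class UploadHelper
 {
     // Streams the uploaded file straight into a varbinary(MAX) column.
     // No byte[] of the full file size is allocated; SqlClient reads from
     // the stream and sends the data to the server as it goes.
     public static void SaveUpload(HttpPostedFileBase file, string connectionString)
     {
         using (var conn = new SqlConnection(connectionString))
         using (var cmd = new SqlCommand(
             "INSERT INTO Files (FileName, FileData) VALUES (@name, @data)", conn))
         {
             cmd.Parameters.Add("@name", SqlDbType.NVarChar, 260).Value = file.FileName;

             // Size -1 means varbinary(MAX); assigning a Stream enables streaming (needs .NET 4.5+).
             cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = file.InputStream;

             conn.Open();
             cmd.ExecuteNonQuery();
         }
     }
 }

Because the whole file still travels in a single INSERT, there is no manual chunking on the caller's side; IIS/ASP.NET request-size limits still have to be raised to allow a 500 MB upload.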

1 Answer


SQL Server can read the file directly from the file system if the stored procedure runs under a login with the bulkadmin server privilege. This bulk command worked for us:

-- OPENROWSET(BULK ...) needs a literal path, so the statement is built as dynamic SQL from the SP's @filepath parameter:
declare @sql nvarchar(max) =
    N'update someTable
      set FileData = (select * from openrowset(bulk ''' + @filepath + N''', single_blob) as blob)
      where id = 1;';
exec sp_executesql @sql;

The share directory requires proper permissions, and we've had problems in every case where it wasn't physically on the DB server.
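
A related note on the follow-up comment about the "Out of Memory" exception when reading the file back into C#: since .NET 4.5, `SqlDataReader.GetStream()` together with `CommandBehavior.SequentialAccess` exposes the column as a stream, so it can be copied to the response or to disk without building a `byte[]`. A minimal sketch; the table and column names follow the answer, and the destination stream is the caller's choice.

 using System.Data;
 using System.Data.SqlClient;
 using System.IO;

 public static class DownloadHelper
 {
     // Copies the varbinary(MAX) column to any destination stream
     // (e.g. Response.OutputStream or a FileStream) without a full byte[].
     public static void CopyFileData(string connectionString, int id, Stream destination)
     {
         using (var conn = new SqlConnection(connectionString))
         using (var cmd = new SqlCommand("SELECT FileData FROM someTable WHERE id = @id", conn))
         {
             cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
             conn.Open();

             // SequentialAccess keeps the reader from buffering the whole row in memory.
             using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
             {
                 if (reader.Read())
                 {
                     using (var blob = reader.GetStream(0)) // needs .NET 4.5+
                     {
                         blob.CopyTo(destination);
                     }
                 }
             }
         }
     }
 }

Called from an MVC action, `destination` could be `Response.OutputStream`, so the file goes straight to the client.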

crokusek