
I have a file of approximately 1.6 GB. I am aware of how to store large files in SQL Server, but I have a requirement to generate the blob for this large file.

I am using the following code to generate the byte[] for the large file:

string filePath = Server.MapPath(filename);        // filename is the file's virtual path, set elsewhere
string fileNameOnly = Path.GetFileName(filePath);

FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read);
BinaryReader br = new BinaryReader(fs);

// Attempts to read the entire 1.6 GB file into a single byte[] in one call
byte[] bytes = br.ReadBytes((int)fs.Length);

br.Close();
fs.Close();

It is throwing

An exception of type 'System.OutOfMemoryException' occurred in mscorlib.dll but was not handled in user code

How can I convert this large file to a byte[]?

  • You are doing `(int)fs.Length`; instead, read chunks of data, e.g. `int chunk = 1024;` and then use that in `br.ReadBytes(chunk)`. I don't know the exact syntax, but I hope you get the gist. – Miguel Sanchez Oct 13 '16 at 05:20
  • Exactly - you can't store a file that big in memory, apparently. Read it in chunks. – Paweł Hemperek Oct 13 '16 at 05:22
  • Are you compiling to 32-bit or 64-bit? Either way, you are pushing the boundary of .NET to create an object that big. The other question is: are you storing a blob for just one file, or lots? SQL will fall over with cells that big, and it will be slow to store and retrieve. – SledgeHammer Oct 13 '16 at 05:24
  • I am storing the blob for 1 file. The size of that file is 1.6 GB. – Trimantra Software Solution Oct 13 '16 at 05:28
  • @TrimantraSoftwareSolution is this file size going to grow with time? – Miguel Sanchez Oct 13 '16 at 05:29
  • @MiguelSanchez No, this is only a one-time conversion. The byte[] will be used for further processing. – Trimantra Software Solution Oct 13 '16 at 05:30
  • @TrimantraSoftwareSolution does it really need to be in the `DB`? Why not process it on the filesystem? – Miguel Sanchez Oct 13 '16 at 05:32
  • @MiguelSanchez The problem is that it actually needs to be in the DB or in memory, because I am working with SharePoint, where I have to upload this file to a SharePoint Document Library programmatically. So I need it as a byte[] in the DB. – Trimantra Software Solution Oct 13 '16 at 05:33
  • @TrimantraSoftwareSolution I don't know much about `SharePoint`, but if you have to stick to the `DB`, then why not store it in chunks instead of 1 big blob? You could store it in 10 chunks of 160 MB each. – Miguel Sanchez Oct 13 '16 at 05:36
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/125556/discussion-between-trimantra-software-solution-and-miguel-sanchez). – Trimantra Software Solution Oct 13 '16 at 05:38
  • @TrimantraSoftwareSolution Another idea would be a `MemoryStream`: stream small chunks to memory, do your processing, and send it to `SharePoint`; I'm sure it has such an interface. – Miguel Sanchez Oct 13 '16 at 05:46
  • Check the article suggested in this answer: http://stackoverflow.com/a/8760195/360171 – Yuriy Rozhovetskiy Oct 13 '16 at 06:48
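
A minimal sketch of the chunked reading the commenters suggest; the 4 KB buffer size, the destination stream, and the GetDestinationStream helper are illustrative assumptions, not part of the original discussion:

// Read the file in small chunks instead of one giant byte[].
using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
using (Stream destination = GetDestinationStream())   // hypothetical target (upload, DB, etc.)
{
    byte[] buffer = new byte[4096];                    // arbitrary chunk size
    int bytesRead;
    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
    {
        destination.Write(buffer, 0, bytesRead);
    }
}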

2 Answers


You SHOULD not do that. Either read the file in chunks, or, in your case, since you want to upload it to SharePoint, just connect the two streams together and the SharePoint library will do the rest, e.g.:

FileStream fileStream = File.OpenRead(fileName);
SPFile spFile = theLibrary.Files.Add(fileName, fileStream, true);

This is done with the SharePoint server-side object model; the same can be done with CSOM:

Microsoft.SharePoint.Client.File.SaveBinaryDirect(clientContext, fileUrl, fileStream, true);
Haitham Shaddad
  • I agree with you @Haitham Shaddad. But I am working on a requirement where I need to store large blobs in the DB first. My client already has blobs stored in SQL tables, but due to data security he cannot give me those blobs, so I have to generate my own. – Trimantra Software Solution Oct 13 '16 at 05:57
  • If the blobs are stored in the database, then read them as a stream as well and use the same method to upload them to SharePoint. Please tell us your exact requirement and your environment (SQL or Oracle?). – Haitham Shaddad Oct 13 '16 at 06:20
  • It seems that you are aware of SharePoint. The exact requirement is to migrate the data from a SQL table to a SharePoint Document library. My client already has large blobs in a SQL table. I have already provided a program which does this task, but for large files my migration program throws an error. So, for testing, I am generating dummy 1.6 GB blobs so that I can test my code. For that reason I want 1.6 GB to be stored in the SQL table. – Trimantra Software Solution Oct 13 '16 at 06:27
  • Are the blobs stored as FILESTREAM blobs in SQL or the traditional way? You can simply read the blobs from the database in byte-array chunks, write them directly to the file system, then upload those files to a document library either using the explorer view or using any tool. – Haitham Shaddad Oct 13 '16 at 06:31
  • The blobs are stored in a VARBINARY(MAX) field as bytes. – Trimantra Software Solution Oct 13 '16 at 06:36
  • Read this article; it explains how to stream varbinary from the database to files: http://www.codeproject.com/Articles/140713/Download-and-Upload-Images-from-SQL-Server-via-ASP – Haitham Shaddad Oct 13 '16 at 06:56
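
A hedged sketch of the approach the last few comments describe: streaming the VARBINARY(MAX) column out of SQL Server in chunks with SqlDataReader and SequentialAccess, writing it to a file instead of building one big byte[]. The table, column, and variable names (Documents, FileData, connectionString, documentId, targetPath) and the chunk size are assumptions for illustration only:

using System.Data;
using System.Data.SqlClient;
using System.IO;

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT FileData FROM Documents WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@id", documentId);
    connection.Open();

    // SequentialAccess streams the column instead of buffering the whole row in memory.
    using (SqlDataReader reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    using (FileStream output = File.Create(targetPath))
    {
        if (reader.Read())
        {
            byte[] buffer = new byte[8040];   // arbitrary chunk size
            long offset = 0;
            long read;
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, (int)read);
                offset += read;
            }
        }
    }
}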

Basically, you can't, not with a byte array.

You can do it for objects larger than 2 GB if you:

1) Are running in 64-bit mode on a 64-bit system. In practice, 32-bit .NET apps cannot allocate much beyond about 1.5 GB.

2) Are running .NET Framework 4.5 or greater, and

3) Have set gcAllowVeryLargeObjects in your app.config (see the snippet below).
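
A minimal app.config sketch for that setting:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>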

But ... array indexers are still limited to an integer - so you can't do this with byte arrays as your array index would be too big.

You can do it with a stream (~8TB is allowed) but not with an array.
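
A minimal sketch of the stream-based alternative; the destination stream and the GetUploadStream helper are hypothetical placeholders for wherever the data needs to go:

// A stream never holds the whole file in memory; CopyTo moves it through small internal buffers.
using (FileStream source = File.OpenRead(filePath))
using (Stream destination = GetUploadStream())   // hypothetical target stream
{
    source.CopyTo(destination);
}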

Sunil Kumar