
Is there any way to upload a file larger than 2 GB using a plain HTML form upload? Previously I uploaded large files through Silverlight using chunking (dividing a large file into segments, uploading the segments one by one, and then reassembling them on the server).

Now we have a requirement to use plain HTML form uploads only (through GWT). Please guide me if there is any way to achieve large file uploads this way.

If it is impossible with plain HTML, can anyone guide me on how to divide a file and upload it in segments using Flex?

BalusC
Nadeem Ullah

3 Answers


Use the HTML5 File API to upload large files. It supports slicing, so you can upload a large file in segments.

var reader = new FileReader();
reader.onload = function(e) {
    // e.target.result holds the chunk's data; send it to the server here
};
var blob = file.slice(startingByte, endingByte); // slice one chunk out of the file
reader.readAsBinaryString(blob);

FileSystem Tutorial

File API tutorial
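The slicing idea above can be extended into a full chunked upload. Here is a hedged sketch: the chunk-boundary calculation is plain JavaScript, while the upload loop assumes a hypothetical `/upload-chunk` endpoint and header names that your server would have to implement and reassemble on its side.

```javascript
// Compute [start, end) byte ranges for a file of the given size,
// split into chunks of at most chunkSize bytes.
function computeChunks(fileSize, chunkSize) {
  const chunks = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    chunks.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return chunks;
}

// Browser-side sketch: slice the File and POST each chunk in order.
// The "/upload-chunk" URL and the X-Chunk-* headers are assumptions --
// the server must expose a matching endpoint that stores the parts.
async function uploadInChunks(file, chunkSize) {
  const chunks = computeChunks(file.size, chunkSize);
  for (let i = 0; i < chunks.length; i++) {
    const [start, end] = chunks[i];
    await fetch('/upload-chunk', {
      method: 'POST',
      headers: {
        'X-Chunk-Index': String(i),
        'X-Chunk-Total': String(chunks.length),
        'X-File-Name': file.name,
      },
      // file.slice() returns a Blob, so the chunk is streamed from disk
      // rather than read fully into memory first.
      body: file.slice(start, end),
    });
  }
}
```

Because each chunk is a separate request, none of them comes near the server's POST size limit, regardless of the total file size.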

Tiny
kongaraju

The limitation on the size of HTTP POST requests is usually not on the HTML side at all; it is on the server side. The web server needs to be configured to accept such large POST requests. The default limit is indeed often 2 GB, and the server will usually return an HTTP 500 error when it is exceeded. The limit can often be raised to 4 GB, but anything beyond that hits the boundary on 32-bit systems. On 64-bit systems with a 64-bit OS, the theoretical boundary is much higher: 16 EB.
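As a hedged illustration of where such limits live (directive names and maximum accepted values vary by server and version, so check your server's documentation), the configuration might look like:

```
# nginx: maximum accepted request body size (0 disables the check)
client_max_body_size 4g;

# Apache httpd: maximum request body size in bytes (0 = unlimited)
LimitRequestBody 0

# Tomcat, in the <Connector> element of server.xml:
# maximum POST size in bytes (-1 = unlimited)
maxPostSize="-1"
```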

If configuring the web server to accept such large POST requests is not an option, or when you want to go beyond the web server's limit, then you have no other option than splitting the file on the client side and reassembling the parts on the server side.

Since HTML is just a markup language, it offers no facilities for splitting the file. You really have to use a normal programming language like C# (Silverlight) or Java (Applet) in the form of a small application which you serve from your web page. It may also be possible with Flash or Flex, but don't pin me on that since I use neither.

That said, FTP is a much better choice than HTTP for transferring (large) files over a network. I'd reconsider the choice of HTTP for this.

BalusC
    Thanks BalusC, We can increase server limit upto 4GB, but IE & FF don't support sending files larger than 2GB. I will look for flash / flex to either break the file or compress it before uploading. Thanks, Nadeem – Nadeem Ullah Feb 19 '11 at 21:19
  • Then you really have to write an application which does the job. HTML is a markup language, not a programming language. If the webserver limit is 4GB, then you can go that far in an application without the need to break in parts. I would by the way not compress it, that's only unnecessary overhead and may end up in saving little to nothing on binary files. – BalusC Feb 19 '11 at 21:22
  • @BalusC, Do you mean that browsers impose no limits? And with a powerful server (read drop), we can [upload TB range](https://torrentfreak.com/5-torrent-files-that-broke-mind-boggling-records-101107/) files? – Pacerier Mar 16 '15 at 13:39
  • very odd limit since servers like webpieces can 'stream' through the file in 5k chunks to not eat up RAM writing it to some backend cloud storage or whatnot as the large file comes in. this must be legacy webservers or something. – Dean Hiller Apr 14 '20 at 21:53
  • @Pacerier sky is the limit except at some point, it may be faster to fly by plane with hard drive (we had that happen once actually...just do the math on the size and how long). ALSO, you have to worry about what to do on a disconnect. the browser may start all over again which can be very very bad. I am hoping html 5 has a solution for not starting all over on large files these days. – Dean Hiller Apr 14 '20 at 21:54

We created a web application (HTTPS, using Django/Python) which uploads bulk files into a SQL Server database. We read the file in chunks, transferred them to the server through SFTP, and performed bulk inserts into SQL Server. We benchmarked it with a 1 GB file, and the end-to-end process took around a minute.

Sudhakar Chavan