
I have a web application that accepts file uploads of up to 4 MB. The server-side script is PHP and the web server is NGINX. Many users have requested that this limit be increased drastically to allow uploads of video etc.

However, there seems to be no easy solution for this problem with PHP. First, on the client side I am looking for something that would allow me to chunk files during transfer. SWFUpload does not seem to do that. I guess I could stream uploads using JavaFX (http://blogs.oracle.com/rakeshmenonp/entry/javafx_upload_file), but I cannot find any equivalent of request.getInputStream in PHP.

Increasing the server's client body size limit, the php.ini upload limits, or max_execution_time is not really a solution for really large files (~1 GB): the browser may time out, and think of all those blobs stored in memory.

Is there any way to solve this problem using PHP on the server side? I would appreciate your replies.

rjha94
  • One of my applications allows > 1 GB files to be uploaded by configuring the same server-side options you mentioned. Users have never reported timeouts or anything of that sort. – Dolph Mar 15 '10 at 14:37
  • A good solution works everywhere, IMHO. Uploading 1 GB files will not work on 56 kbps modems or slow connections. A chunking solution would be very robust and could support resuming interrupted uploads. – rjha94 Jul 27 '12 at 13:30
  • Take a look here: https://tus.io/ – sgargel Sep 12 '17 at 12:15

8 Answers


Plupload is a JavaScript/PHP library; it's quite easy to use and supports chunking.

It uses HTML5 though.
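
On the PHP side, a chunk handler might look like the following minimal sketch. It assumes Plupload's default multipart field names (name, chunk, chunks, file) and sequential chunk delivery; check the Plupload docs before relying on it.

```php
<?php
// upload.php - hypothetical receiver for Plupload's chunked multipart uploads.
// Assumes the default field names "name", "chunk", "chunks" and "file".
$fileName = isset($_POST['name']) ? $_POST['name'] : 'upload.bin';
$fileName = preg_replace('/[^A-Za-z0-9_.-]/', '_', $fileName); // basic sanitizing
$chunk    = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks   = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;

$dir = __DIR__ . '/uploads';
if (!is_dir($dir)) {
    mkdir($dir, 0755, true);
}
$target = $dir . '/' . $fileName . '.part';

// Plupload sends chunks sequentially by default, so appending is safe here.
$out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Last chunk received: move the assembled file into place.
if ($chunk === $chunks - 1) {
    rename($target, $dir . '/' . $fileName);
}
```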

Dean Rather
  • Worth mentioning that in the meantime it also supports HTML4, Silverlight and Flash, as can be seen [here](http://www.plupload.com/example_all_runtimes.php). – jdepypere Aug 06 '14 at 17:00

Take a look at the tus protocol, an HTTP-based protocol for resumable file uploads: you can carry on where you left off without re-uploading the whole file again after an interruption. The protocol has also been adopted by Vimeo since May 2017.

You can find various implementations of the protocol in different languages here. In your case, you can use its JavaScript client, called uppy, together with a golang- or PHP-based server implementation on the server.
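
For the PHP side, here is a minimal server sketch based on the ankitpokhrel/tus-php README (the package linked in the comments below); the exact API may differ between versions, so treat it as a starting point.

```php
<?php
// server.php - hypothetical tus endpoint using ankitpokhrel/tus-php.
// Install with: composer require ankitpokhrel/tus-php
require __DIR__ . '/vendor/autoload.php';

$server = new \TusPhp\Tus\Server();  // file-based cache by default
$server->setApiPath('/files');       // the endpoint path the tus client talks to

// serve() handles the protocol's POST/HEAD/PATCH requests and tracks offsets,
// which is what makes interrupted uploads resumable.
$response = $server->serve();
$response->send();
exit(0);
```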

Konsole
  • Can't find an example. Plus, does it allow setting request headers? – TheRealChx101 Oct 01 '18 at 02:40
  • Yes, you can add request headers using the tus middleware (https://github.com/ankitpokhrel/tus-php#middleware), and you can find an example implementation here: https://github.com/ankitpokhrel/tus-php/tree/master/example – Konsole Oct 01 '18 at 03:06

"but I can not find any equivalent of request.getInputStream in PHP. "

fopen('php://input'); perhaps?
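
A minimal sketch of how that could be used, assuming the client sends the file as the raw POST body (php://input is not available for multipart/form-data); the target path is a placeholder.

```php
<?php
// Stream the raw request body to disk without buffering it all in memory.
// '/tmp/upload.bin' is a placeholder target path.
$in  = fopen('php://input', 'rb');
$out = fopen('/tmp/upload.bin', 'wb');

// Copy in 8 KB blocks so memory use stays constant regardless of file size.
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}

fclose($in);
fclose($out);
```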

Bart van Heukelom

I have created a JavaFX client that sends large files in chunks of the maximum post size (I am using 2 MB) and a PHP receiver script that assembles the chunks into the original file. I am releasing the code under the Apache license here: http://code.google.com/p/gigaupload/ Feel free to use/modify/distribute.

rjha94

Try using the bigupload script. It is very easy to integrate and can upload files of up to 2 GB in chunks. The chunk size is customizable.

Stefan Dorunga

How about using a Java applet for the upload and PHP for the processing?

You can find a JUpload example here: http://sourceforge.net/apps/mediawiki/jupload/index.php?title=PHP_Example

Chris
  • Thanks for the link. That should be possible; I am trying JavaFX right now. The JUpload screenshot looks about 10 years old ;o) – rjha94 Mar 19 '10 at 11:30
  • Haha, not sure man :) but the PHP code is pretty recent (last updated a few days ago). You can see how they do the chunking here: http://jupload.svn.sourceforge.net/viewvc/jupload/trunk/wwwroot/samples.PHP/jupload.php?view=markup – Chris Mar 22 '10 at 08:43

You can use this package. It supports resumable chunked uploads. In the examples/js-examples/resumable-chunk-upload example, you can close and re-open the browser and then resume incomplete uploads.
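
To illustrate how that kind of resume can work in general, here is a generic sketch (not this package's actual API): the client asks which chunks the server already has and re-sends only the missing ones. All names and the `<index>.part` layout are illustrative.

```php
<?php
// status.php - hypothetical endpoint reporting already-uploaded chunk indices.
// Chunks are assumed to be stored as "<index>.part" in a per-upload directory.
$fileId = isset($_GET['fileId']) ? $_GET['fileId'] : '';
$fileId = preg_replace('/[^A-Za-z0-9_-]/', '', $fileId); // basic sanitizing
$dir    = __DIR__ . '/chunks/' . $fileId;

$have = array();
if (is_dir($dir)) {
    foreach (glob($dir . '/*.part') as $path) {
        $have[] = (int) basename($path, '.part');
    }
}
sort($have);

// The client compares this list against its own chunk count and
// re-sends only the chunks that are missing.
header('Content-Type: application/json');
echo json_encode(array('uploaded' => $have));
```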

Mahdi

You can definitely write a web app that will accept a block of data (even via a POST) and then append that block of data to a file. It seems to me that you need some kind of client-side app that will take a file, break it up into chunks, and send them to your web service one chunk at a time. However, it seems a lot easier to create an SFTP directory and let clients upload files over SFTP using some pre-existing client app.

Zak
  • You might not want to just append each chunk to the destination file as it comes in, because it's possible for the chunks to arrive out of order. The solution we use is to save each chunk with a numerical id, then combine all the chunks once they have all been uploaded. – rodrigo-silveira Jan 23 '13 at 16:55
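
A minimal sketch of the numbered-chunk approach described in the comment above, with illustrative names: each chunk is assumed to have been saved as `<index>.part` in a per-upload directory, and assembly only happens once every chunk is present, so out-of-order arrival is harmless.

```php
<?php
// Combine numbered chunk files ("0.part", "1.part", ...) into the final file.
// $dir, $total and $finalPath are placeholders for your own values.
function assembleChunks($dir, $total, $finalPath)
{
    // Refuse to assemble until every chunk has arrived.
    for ($i = 0; $i < $total; $i++) {
        if (!file_exists("$dir/$i.part")) {
            return false;
        }
    }

    $out = fopen($finalPath, 'wb');
    for ($i = 0; $i < $total; $i++) {
        $in = fopen("$dir/$i.part", 'rb');
        stream_copy_to_stream($in, $out); // concatenate in numeric order
        fclose($in);
        unlink("$dir/$i.part");           // clean up as we go
    }
    fclose($out);
    return true;
}
```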