
I have a PHP form which takes a selected zip file on submit and uses AJAX to POST it to an Amazon S3 bucket. This works great when I test zip files of 10MB or less, but when I test something over 500MB I get the following error:

> Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.

I need to allow zip files of up to 5GB to be uploaded. At this point I'm looking for a better understanding of what it takes to support uploads that large. Any help is much appreciated.

PS: I will be happy to add code snippets, but I am looking for an answer that identifies which snippets are necessary rather than posting a bunch of unrelated code.

codacopia
  • Perhaps [this question](http://stackoverflow.com/questions/18884683/browser-uploads-to-s3-with-instance-roles) will be helpful to you. It discusses uploads directly to S3 but with authorisation done in PHP. – Ja͢ck Feb 12 '15 at 00:36

2 Answers


Perhaps you should increase the post_max_size variable in your php.ini file, which is set to 8M by default if I remember correctly. You should also have a look at the upload_max_filesize and max_execution_time variables, since I'd guess it takes a long time to upload 5GB... (Note that post_max_size and upload_max_filesize are PHP_INI_PERDIR settings, so they can't be changed with ini_set() at runtime; set them in php.ini or .htaccess instead.)
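If the upload does pass through your PHP server (rather than going straight to S3), a minimal php.ini sketch might look like the following. The values are illustrative assumptions, not tested recommendations:

```ini
; Illustrative values only; tune to your environment
upload_max_filesize = 5G
post_max_size = 5G          ; must be >= upload_max_filesize
max_execution_time = 3600   ; seconds; a 5GB upload takes a while
max_input_time = 3600       ; time PHP may spend reading the request
```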

If your issue is jQuery/Ajax related, maybe you should consider using a FormData object.

Sending multipart/formdata with jQuery.ajax

https://developer.mozilla.org/en-US/docs/Web/API/FormData

berthni
  • Thank you, I will look into these options and see if they offer the solution. – codacopia Feb 11 '15 at 22:17
  • The only thing that concerns me about this is the way I am using AJAX to bypass our servers with the upload... Either way I will test this and update the question with my results. – codacopia Feb 11 '15 at 22:23
  • I have attempted 15MB and 30MB packages and they work great. I will research these other links and see if that opens up any options to test. – codacopia Feb 12 '15 at 14:53

OK, in the end I had to use the Amazon SDK to chunk large files with a multipart upload. This breaks the file into multiple pieces, so no single request stays open long enough to hit the server timeout.
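For anyone who hits the same wall, here is a minimal sketch of that approach using the AWS SDK for PHP v3's MultipartUploader. The region, bucket name, key, and file path below are placeholders:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// Placeholder region; credentials come from the environment or config
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// MultipartUploader splits the file into parts (5MB minimum each)
// and sends them as separate requests, so no single request idles
// long enough to trigger the socket timeout.
$uploader = new MultipartUploader($s3, '/tmp/archive.zip', [
    'bucket'    => 'my-bucket',          // placeholder bucket
    'key'       => 'uploads/archive.zip',
    'part_size' => 10 * 1024 * 1024,     // 10MB parts
]);

try {
    $result = $uploader->upload();
    echo "Uploaded to {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo "Upload failed: " . $e->getMessage() . "\n";
}
```

A side benefit of multipart uploads is that a failed part can be retried on its own, so a hiccup near the end of a 5GB transfer doesn't force you to start over.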

codacopia