31

I made an upload page in PHP, but I don't know why the page will not upload documents larger than 500 MB. This is my first time trying to upload something that large. I changed all the relevant settings in php.ini (post_max_size = 700M, upload_max_filesize = 600M, and max_execution_time = 300). The code for the upload is below:

if (isset($_FILES['upload']) && !empty($_FILES['upload']['name'])) {
    move_uploaded_file($_FILES['upload']['tmp_name'], $this->filePath . $this->fileName);
}

I need help; I wonder if there is something I am not doing right.

George
  • What are the errors in your server error log? – Jocelyn Apr 19 '13 at 10:48
  • @all, sorry, my server was down. The error returned is "undefined index name", as if the file upload input field does not exist, or sometimes it says execution timeout. It's frustrating me – George Apr 19 '13 at 11:16
    Execution timeout would make sense on `move_uploaded_file` due to the copy operation it does. As for the timeout, fear not - transfer of the data to the server is **NOT** included in the timeout. By the time PHP starts, your request has already arrived. – Sébastien Renauld Apr 19 '13 at 14:10

3 Answers

27

Do you think increasing the upload size limit will solve the problem? What if someone uploads a 2 GB file, what happens then? Have you taken into consideration the memory usage of such a script?

Instead, what you need is a chunked upload; see Handling plupload's chunked uploads on the server-side and File uploads; How to utilize "chunking"?
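
Roughly, the server side looks like the sketch below. It assumes the client posts each piece with a chunk index and a total chunk count; the parameter names chunk, chunks and name follow plupload's defaults, the upload field name matches the question's form (plupload itself posts the file as file by default), and the target directory is a placeholder:

if (isset($_FILES['upload']) && $_FILES['upload']['error'] === UPLOAD_ERR_OK) {
    $fileName = basename(isset($_REQUEST['name']) ? $_REQUEST['name'] : $_FILES['upload']['name']);
    $chunk    = isset($_REQUEST['chunk'])  ? (int) $_REQUEST['chunk']  : 0;
    $chunks   = isset($_REQUEST['chunks']) ? (int) $_REQUEST['chunks'] : 1;

    $target = '/path/to/uploads/' . $fileName . '.part'; // placeholder destination

    // Append this chunk's bytes to the partial file (truncate on the first chunk).
    $out = fopen($target, $chunk === 0 ? 'wb' : 'ab');
    $in  = fopen($_FILES['upload']['tmp_name'], 'rb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);

    // Once the last chunk has arrived, give the file its final name.
    if ($chunk === $chunks - 1) {
        rename($target, '/path/to/uploads/' . $fileName);
    }
}

Each request then only has to fit inside post_max_size and the execution time limit, no matter how large the complete file is.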

Twisted1919
12

By configuration, PHP only allows uploading files up to a certain size. There are lots of articles around the web that explain how to modify this limit.

For instance, you can edit your php.ini file and set:

memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M

You will then need to restart Apache.
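
If the new values do not seem to take effect, it can help to dump the limits PHP is actually running with, since the web server may read a different php.ini than the command line does. A small diagnostic sketch:

echo 'upload_max_filesize = ' . ini_get('upload_max_filesize') . "\n";
echo 'post_max_size       = ' . ini_get('post_max_size') . "\n";
echo 'memory_limit        = ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time  = ' . ini_get('max_execution_time') . "\n";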

Note:
That being said, uploading large files like that is not very reliable. Errors can occur. You may want to split the files and include some additional data for error correction. One way to do that is to use PAR recovery files. You can then check the files after upload using the par command-line utility on Unix-like systems.
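
A simpler integrity check (not PAR recovery, just a hash comparison) is to have the client send a checksum along with the file and verify it once the upload completes. The form field names below are assumptions for illustration only:

if (isset($_FILES['upload'], $_POST['expected_sha1'])) {
    // Compare the hash of what actually arrived with what the client computed.
    $actual = sha1_file($_FILES['upload']['tmp_name']);
    if ($actual !== strtolower(trim($_POST['expected_sha1']))) {
        exit('Upload appears to be corrupted; please retry.');
    }
    move_uploaded_file($_FILES['upload']['tmp_name'],
                       '/path/to/uploads/' . basename($_FILES['upload']['name']));
}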

Jean
0

I assume you mean that you are transferring the files via HTTP. While not quite as bad as FTP, it's not a good idea if you can find another way of solving the problem. HTTP (and hence the programs built on it) is optimized around transferring relatively small files around the internet.

While the protocol supports server-to-client range requests, it does not allow the reverse, so a partially completed upload cannot be resumed where it left off. Even if the software at either end were unaffected by the volume, the more data you push across, the longer the transfer takes and the greater the window in which you could lose the connection. And that first caveat is the biggest problem: if the connection drops, the upload has to start over from the beginning.