I have XenForo and YetiShare. In both, when I try to upload files around 256 MB, the progress bar starts, gets to 150 MB or so, then drops back to 0% and starts uploading again. It does this about 4 times and then shows an error that the file can't be uploaded.

I can upload a 126 MB file without any problem.

My PHP info: http://dl.godgivens.com/temp.php

I have set PHP's temp directory to /tmp and uploaded a .htaccess file there with these settings (these are the only lines in it):

LimitRequestBody 0
php_value upload_max_filesize 0
php_value post_max_size 4939212390
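
One thing worth noting: php_value lines in a .htaccess only take effect when PHP runs as an Apache module; when PHP runs through FastCGI/mod_fcgid they are generally ignored, so it can help to check which values PHP actually ends up applying. A throwaway sketch (the filename is mine; it just prints ini_get values):

<?php
// check_limits.php - hypothetical throwaway diagnostic
// prints the upload-related values PHP actually applies for this vhost/SAPI
foreach (array('upload_max_filesize', 'post_max_size', 'max_execution_time',
               'max_input_time', 'memory_limit', 'upload_tmp_dir') as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}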

My file uploader is index.php; its .htaccess file also has the above settings, and so does the folder the files are saved to.

You can see my php.ini settings at the link I gave above.

In httpd.conf, under my httpd folder in /etc, I have set Timeout to 900.

I couldn't find LimitRequestBody anywhere.

In php.conf, I have added this to the last section:

<Files *.php>
SetOutputFilter PHP
SetInputFilter PHP
LimitRequestBody 0
</Files>

and I have added this line to fcgid.conf:

FcgidMaxRequestLen 1073741824

So it looks like this:

# This is the Apache server configuration file for providing FastCGI support
# via mod_fcgid
#
# Documentation is available at http://fastcgi.coremail.cn/doc.htm

LoadModule fcgid_module modules/mod_fcgid.so

<IfModule mod_fcgid.c>

<IfModule !mod_fastcgi.c>
    AddHandler fcgid-script fcg fcgi fpl
</IfModule>

  FcgidIPCDir /var/run/mod_fcgid/sock
  FcgidProcessTableFile /var/run/mod_fcgid/fcgid_shm

  FcgidIdleTimeout 40
  FcgidProcessLifeTime 30
  FcgidMaxProcesses 20
  FcgidMaxProcessesPerClass 8
  FcgidMinProcessesPerClass 0
  FcgidConnectTimeout 30
  FcgidIOTimeout 45000
  FcgidInitialEnv RAILS_ENV production
  FcgidIdleScanInterval 10
  FcgidMaxRequestLen 1073741824

</IfModule>
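
A side note on the math here: 1073741824 bytes is exactly 1 GiB, so even with everything else in place this line caps request bodies at about 1 GB. For 2 GB uploads it would presumably need to be at least 2 * 1024 * 1024 * 1024 = 2147483648, assuming the module accepts a value that large:

  FcgidMaxRequestLen 2147483648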

I have also added it in the domain configuration, nginxdomainvirtualhost.php.

I have rebooted the server, reloaded httpd, restarted it, and reconfigured things, but it is still the same as before :/

I want to be able to upload 1 GB to 2 GB files on my forum.

Here is an example of it uploading: http://gyazo.com/c40cb03c503f172a3bc737d688f7cb00.gif

This is the guide I followed: 4GB HTTP File Uploads Using jQuery-File-Upload, Apache and PHP
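
If it helps narrow down where it dies: on the PHP side the upload ultimately arrives in $_FILES, and the error code there usually says which limit tripped. A minimal logging sketch; the field name 'files' is only an assumption based on jQuery-File-Upload's defaults, and if post_max_size is exceeded $_FILES arrives empty:

<?php
// upload_debug.php - hypothetical sketch for logging upload error codes
if (empty($_FILES['files'])) {
    // an empty $_FILES on a large POST usually means post_max_size was exceeded
    $len = isset($_SERVER['CONTENT_LENGTH']) ? $_SERVER['CONTENT_LENGTH'] : 'n/a';
    error_log('no files received, Content-Length: ' . $len);
} else {
    foreach ((array) $_FILES['files']['error'] as $i => $code) {
        // UPLOAD_ERR_INI_SIZE  (1) -> upload_max_filesize exceeded
        // UPLOAD_ERR_FORM_SIZE (2) -> MAX_FILE_SIZE form field exceeded
        // UPLOAD_ERR_PARTIAL   (3) -> upload was cut off mid-request
        error_log('file ' . $i . ' error code: ' . $code);
    }
}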

  • What about the Apache timeout? I saw `max_execution_time` was OK, but the tutorial says to set the `apache2.conf` timeout to 900. – Ludovic Guillaume Dec 22 '13 at 10:07
  • Also, did you set `` in the upload form? – Ludovic Guillaume Dec 22 '13 at 10:09
  • Sorry, I don't have anything called apache2.conf as I am on a CentOS VPS; I have httpd.conf, which is the file I edited. Yes, here is the line: # Timeout: The number of seconds before receives and sends time out. Timeout 900 ------------------------------ Mine is written in PHP, and if you think you could help, I can share the index.php with you, which is the only one with permission to upload files. – kks21199 Dec 22 '13 at 10:15
  • StackOverflow rewards others through votes, not money. I've edited your question to exclude offers of payment. – WiredPrairie Dec 22 '13 at 19:47
  • Instead of uploading via PHP, I highly recommend you use another form of transfer, i.e. FTP, SSH, etc., especially if you're going to be uploading several gigs. – RobAtStackOverflow Dec 22 '13 at 20:05

1 Answer

I think this is because your hosting provider has set an upload limit of around 126 MB...

So you must ask your hosting provider to increase the upload limit, or you can use a client-side FTP upload script (Flash/Java) to handle the file upload. But for client-side FTP upload you would need to modify your YetiShare script.

  • That was my own VPS, as I had mentioned above. Without root access, I couldn't have edited most of the files above. – kks21199 Mar 25 '14 at 20:49