I'm working on an upload tool for large files (1GB - 4GB) that go to an FTP server. I use HTML5 to slice each file into 1MB chunks and upload every chunk to a temp folder. As soon as all chunks have been uploaded, a PHP script assembles the final file from those chunks:
// Append each 1MB part to the final file in upload order
if (($fp = fopen(UPLOAD_DIR.$fileName, 'wb')) !== false) {
    for ($i = 1; $i <= $totalFiles; $i++) {
        // Read the whole part into memory, then append it
        fwrite($fp, file_get_contents($tempDir.'/'.$fileName.'.part'.$i));
    }
    fclose($fp);
}
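For reference, a stream-based variant of the merge (a minimal sketch, assuming the same UPLOAD_DIR and part naming as above) would avoid building each chunk as a PHP string; stream_copy_to_stream() copies between file handles in buffered blocks, so memory usage stays flat:

if (($fp = fopen(UPLOAD_DIR.$fileName, 'wb')) !== false) {
    for ($i = 1; $i <= $totalFiles; $i++) {
        // Copy the part directly into the target file,
        // block by block, instead of via file_get_contents()
        $part = fopen($tempDir.'/'.$fileName.'.part'.$i, 'rb');
        if ($part !== false) {
            stream_copy_to_stream($part, $fp);
            fclose($part);
        }
    }
    fclose($fp);
}

I'm not sure how much that alone buys on a 3GB file, though.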
Everything works for smaller files (tested with files around 1GB - 1.5GB), but as soon as I upload very large files (e.g. 3GB) my PHP script stops with a 504 Gateway Timeout. The max_execution_time in my php.ini is set to 90. Is there any way to avoid the 504 timeout in this case, or a way to speed up the script that creates the file? I could raise max_execution_time, but that doesn't seem like the right solution. Any suggestions?
EDIT: The script is running on a Linux system.
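Since it's Linux, delegating the concatenation to the shell might be an option, roughly like this (a rough sketch; the part list is built explicitly in numeric order, since a naive glob would sort part10 before part2):

// Build the part list in upload order and let cat do the copy
$parts = [];
for ($i = 1; $i <= $totalFiles; $i++) {
    $parts[] = escapeshellarg($tempDir.'/'.$fileName.'.part'.$i);
}
$target = escapeshellarg(UPLOAD_DIR.$fileName);
exec('cat '.implode(' ', $parts).' > '.$target, $out, $status);

Would that realistically be faster than the fwrite() loop, or is the bottleneck elsewhere?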