I implemented code in Laravel to handle chunked uploading, shown below.
// if total size equals length of file, we have gathered all patch files
if ($size == $length) {
    // write patches to file
    foreach ($patch as $filename) {
        // get offset from filename
        list($dir, $offset) = explode('.patch.', $filename, 2);
        // read the patch contents
        $patch_contents = Storage::disk('tmp')->get($filename);
        // apply the patch to the assembled file
        Storage::disk('tmp')->append($dir . $name, $patch_contents, "");
    }
    // remove patches
    foreach ($patch as $filename) {
        Storage::disk('tmp')->delete($filename);
    }
}
The problem is that, for large files, the following error occurs:
"Allowed memory size of 134217728 bytes exhausted (tried to allocate 160000008 bytes)"
I know the error is related to the append method. I worked around it following the solution in this link, as shown below.
// if total size equals length of file, we have gathered all patch files
if ($size == $length) {
    // save current limits so we can restore them afterwards
    $time_limit = ini_get('max_execution_time');
    $memory_limit = ini_get('memory_limit');
    // lift the limits for the assembly step
    set_time_limit(0);
    ini_set('memory_limit', '-1');
    // write patches to file
    foreach ($patch as $filename) {
        // get offset from filename
        list($dir, $offset) = explode('.patch.', $filename, 2);
        // read the patch contents
        $patch_contents = Storage::disk('tmp')->get($filename);
        // apply the patch to the assembled file
        Storage::disk('tmp')->append($dir . $name, $patch_contents, "");
    }
    // remove patches
    foreach ($patch as $filename) {
        Storage::disk('tmp')->delete($filename);
    }
    // restore the original limits
    set_time_limit((int) $time_limit);
    ini_set('memory_limit', $memory_limit);
}
But I don't have a good feeling about this solution! My questions are:
- First of all, why does the append method cause this error?
- Is this workaround appropriate?
- What is Laravel's recommended way to handle this problem?
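For reference, this is a stream-based alternative I am considering instead of the memory-limit workaround (an untested sketch; it assumes the tmp disk is a local driver, so Storage::disk('tmp')->path() returns a real filesystem path):

```php
// Sketch: stream each patch into the target file instead of loading it
// fully into memory with get()/append().
$target = fopen(Storage::disk('tmp')->path($dir . $name), 'ab');
foreach ($patch as $filename) {
    // readStream() returns a PHP stream resource, so the patch is never
    // buffered in memory as one large string
    $source = Storage::disk('tmp')->readStream($filename);
    // copy the stream contents in small internal chunks
    stream_copy_to_stream($source, $target);
    fclose($source);
    // patch applied, remove it
    Storage::disk('tmp')->delete($filename);
}
fclose($target);
```

Would something like this be the idiomatic Laravel approach, or is there a built-in mechanism for this?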