I'm using Laravel to upload multiple files to an S3 bucket, but I can't get the progress data correctly on the client side.
On the client side I have the following (simplified):
var xhr = new XMLHttpRequest();
xhr.addEventListener("progress", updateProgress, false);
xhr.addEventListener("load", transferComplete, false);
xhr.open("POST", "my_url_to_upload");
xhr.send(formData); // formData is defined earlier and contains multiple files

function updateProgress(e) {
    console.log('updateProgress', e);
}

function transferComplete(e) {
    console.log("Files uploaded successfully.", e);
}
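For reference, formData is built roughly along these lines (simplified; the input element ID and the field name "file[]" are just how I've set it up, with "file[]" chosen so Laravel receives the files as an array):

var formData = new FormData();
var files = document.getElementById("file-input").files; // <input type="file" multiple>
for (var i = 0; i < files.length; i++) {
    formData.append("file[]", files[i]); // "file[]" so PHP/Laravel sees an array under "file"
}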
On Laravel's side:
$files = \Input::file('file');
$s3 = \Storage::disk('s3');

foreach ($files as $file) {
    $file_name = "/some_folder/" . $file->getClientOriginalName();
    $s3->put($file_name, file_get_contents($file), 'public');
}
This works great as far as uploading the files to the S3 bucket.
The problem is that when uploading multiple files, the client-side updateProgress function is only called once, and only after all of the files have been uploaded (rather than during the uploads or after each individual file).
Ideally, the progress bar would be updated periodically while the files are uploading, so that for large files it shows close-to-real-time progress rather than only jumping when each file completes.
How would I get Laravel (or PHP in general) to report the progress back to the client during the uploads?
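For what it's worth, I understand that attaching the progress listener to xhr.upload (rather than to xhr itself) should report how much of the request body has been sent, roughly like this (untested sketch):

// Listen on xhr.upload to track the request body being sent to the server
xhr.upload.addEventListener("progress", function (e) {
    if (e.lengthComputable) {
        console.log("Uploaded " + e.loaded + " of " + e.total + " bytes");
    }
}, false);

But as far as I can tell, that only covers the browser-to-Laravel leg; it says nothing about how far along the server is in pushing each file to S3, which is the part I'm asking about.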