
I need to upload a large file in XAMPP through PHP using the $_FILES[] and move_uploaded_file() functions. Files below 1 GB were uploaded successfully (see snapshot), but a file above 1 GB could not be uploaded. I changed the related variables in 'php.ini', but that didn't help: the browser says 'The connection was reset.' and the page disconnects (see next snapshot). I changed the variables as follows:

post_max_size = 11000M
upload_max_filesize = 10000M
max_execution_time = 36000
max_input_time = 36000
memory_limit = 15000M

How can I upload a file larger than 1 GB through PHP on my localhost XAMPP server?

  • You'll probably need to modify some web server settings as well - for example the `LimitRequestBody` directive of the apache configuration is very relevant to this. – CBroe Aug 25 '23 at 09:37
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Aug 25 '23 at 11:35
  • Are you sure that PHP is throwing that error? If not, this question might be more on-topic on serverfault.com – Nico Haase Aug 25 '23 at 12:14
  • There is no error in the PHP code. If there were an error, files below 1 GB would not have been uploaded either. – mr.biratpdl Aug 25 '23 at 12:19
  • Why is this tagged with PHP then, if PHP is not to blame? – Nico Haase Aug 25 '23 at 12:33

1 Answer


As many people said in the comments, you most likely exceeded one or more of the limits set in your XAMPP stack configuration. Either you change your PHP settings, as you already did, or you update the web server settings (Apache or nginx). There are many configuration variations you need to check (this list is not complete; illustrative snippets follow it):

How PHP is configured:

  • If it's FPM, you should check your PHP-FPM configuration
    • Check the connection timeout between the web server and the PHP process (there is one) - for nginx + PHP-FPM it's in the settings for that particular site
  • If PHP is loaded as an Apache module, the configuration is completely different (check the Apache config or the virtual host config)
  • FastCGI has yet another set of configuration options (most likely global, but it could be site-specific)
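
In a stock XAMPP setup (Apache, PHP as a module), the `LimitRequestBody` directive mentioned in the comments is the most likely culprit: since Apache 2.4.53 its default is 1 GiB, which would explain the exact cutoff you're seeing. The snippet below shows the usual suspects; the values are only illustrative, so size them to your largest expected upload:

```apacheconf
# httpd.conf or the relevant <VirtualHost> (Apache):
# 0 means unlimited; alternatively set a byte count above your largest upload
LimitRequestBody 0

# Only if PHP runs via mod_fcgid (its default request size limit is ~128 KB):
FcgidMaxRequestLen 11000000000

# Only for an nginx + PHP-FPM stack (site config), shown for completeness:
# client_max_body_size 11000m;
# fastcgi_read_timeout 36000s;
```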

If you're interested in a solution that doesn't involve tinkering with the technology stack configuration, there is one: chunked uploads.

Uploading big files in one go is generally not a good idea; you should consider doing it in chunks instead. What happens if a user has to wait 30-odd minutes for an upload, only for it to fail at the last minute? Chunked uploads take just as long, but you get better control and a much better chance of getting the file through.

It has the following advantages over monolithic (all-at-once) uploads:

  • Fault tolerance can be implemented:
    • When a chunk fails, you won't have to re-upload the whole file, just the failed chunk
    • When a chunk fails, you can silently repeat it, without bothering the user with it (Vastly superior UX)
  • Possibility to adapt to server restrictions, such as maximum POST size, memory limit and other settings (chunk sizes can be chosen to fit those restrictions), enabling uploads of files of any size
  • Possibility to provide a progress bar (updated as each chunk is successfully uploaded)
  • Can be executed asynchronously:
    • could be delegated to a worker thread
    • multiple chunks could be uploaded at the same time (the server side should be able to handle that: when all chunks are successfully uploaded, do the merge)

Notes:

  • This answer won't contain a complete implementation, as chunked uploads are not trivial to implement, and plenty of libraries/implementations are already available. Most of them are opinionated, so I'll let you choose the best fit (a Google search for chunked upload libraries for JavaScript/PHP will give you plenty of results); the sketches below only illustrate the idea.

  • The very basics are shown in AJAX/PHP - Upload a large file in segments in pure javascript, but that in itself is not enough: the answer to the linked question doesn't mitigate failed chunks and other errors; it only shows the general idea.


So what you need to do is:

Client side:

  • when the file is selected, break it up into small chunks (~500KB - 5MB in size)
  • go through the chunks and post them one by one via AJAX/XMLHttpRequest
  • if a request fails, retry the upload of that chunk a reasonable number of times (fault tolerance)
  • do this until you're out of chunks
  • the steps above don't include security (validation, etc.), but that should never be forgotten; a minimal sketch of the loop follows
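
In a browser this loop would be JavaScript built on File.slice() and XMLHttpRequest (or fetch), as in the linked question; to keep all examples in one language, here is a minimal command-line sketch of the same chunk-and-retry logic in PHP with the cURL extension. The endpoint URL and the field names (uploadId, index, total, name, chunk) are assumptions invented for this sketch:

```php
<?php
// Minimal chunked-upload client sketch (CLI). Assumes a hypothetical
// upload.php endpoint accepting: uploadId, index, total, name, chunk (file).
$file       = $argv[1];                // path to the big file
$chunkSize  = 2 * 1024 * 1024;         // 2 MB per chunk
$total      = (int) ceil(filesize($file) / $chunkSize);
$uploadId   = uniqid('up_', true);     // identifies this upload session
$maxRetries = 3;

$in = fopen($file, 'rb');
for ($index = 0; $index < $total; $index++) {
    $data = fread($in, $chunkSize);
    // Write the chunk to a temp file so CURLFile can post it as an upload
    $tmp = tempnam(sys_get_temp_dir(), 'chunk');
    file_put_contents($tmp, $data);

    for ($attempt = 1; $attempt <= $maxRetries; $attempt++) {
        $ch = curl_init('http://localhost/upload.php'); // assumed endpoint
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POSTFIELDS     => [
                'uploadId' => $uploadId,
                'index'    => $index,
                'total'    => $total,
                'name'     => basename($file),
                'chunk'    => new CURLFile($tmp),
            ],
        ]);
        curl_exec($ch);
        $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
        curl_close($ch);
        if ($ok) {
            break;                     // chunk accepted, move on
        }
        if ($attempt === $maxRetries) {
            fwrite(STDERR, "Chunk $index failed after $maxRetries tries\n");
            exit(1);                   // give up; server keeps partial state
        }
    }
    unlink($tmp);
    echo 'Progress: ', round(($index + 1) / $total * 100), "%\n";
}
fclose($in);
```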

Server side:

  • your endpoint(s) should be able to handle:
    • upload start/first chunk
    • upload end/last chunk
    • rest of the chunks
  • at first chunk make a manifest somewhere (file/database) which stores your current upload state:
    • number of total chunks,
    • successfully uploaded chunks (with their order),
    • current chunk
    • what file/upload session the chunk belongs to (to prevent mixing it with chunks from other uploads)
  • at last chunk concatenate the chunks in the correct order and save them into a file
  • update state at every chunk after the upload of the specific chunk is successful
  • provide proper error output for any failure (the client has to rely on that error info when a chunk fails; you can show meaningful messages to the user and/or act on the error) - a sketch of such an endpoint follows
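
Here is a minimal sketch of the matching endpoint. It uses the staging directory itself as the manifest (one .part file per received chunk) instead of a database; upload.php, the uploads/ directory and the field names are the same assumptions as in the client sketch, and a real implementation would add authentication, size/type validation, locking around the final merge and cleanup of abandoned uploads:

```php
<?php
// upload.php - minimal chunk-receiving sketch (assumed names throughout).
// Expects POST fields: uploadId, index, total, name, and a 'chunk' file.
$uploadId = preg_replace('/[^a-zA-Z0-9_.]/', '', $_POST['uploadId'] ?? '');
$index    = (int) ($_POST['index'] ?? -1);
$total    = (int) ($_POST['total'] ?? 0);

if ($uploadId === '' || $index < 0 || $total < 1 || !isset($_FILES['chunk'])) {
    http_response_code(400);
    exit('Bad request');
}

// Staging directory doubles as the upload-state manifest:
// one file per successfully received chunk, named by its order
$dir = __DIR__ . '/uploads/' . $uploadId;
if (!is_dir($dir) && !mkdir($dir, 0775, true)) {
    http_response_code(500);
    exit('Cannot create staging dir');
}

// Store this chunk under its index so the order is preserved
if (!move_uploaded_file($_FILES['chunk']['tmp_name'], "$dir/$index.part")) {
    http_response_code(500);
    exit('Chunk not saved');           // client will retry this chunk
}

// If every chunk is present, concatenate them in order into the final file
if (count(glob("$dir/*.part")) === $total) {
    $name = basename($_POST['name'] ?? 'upload.bin'); // strip any path parts
    $out  = fopen(__DIR__ . '/uploads/' . $name, 'wb');
    for ($i = 0; $i < $total; $i++) {
        fwrite($out, file_get_contents("$dir/$i.part"));
        unlink("$dir/$i.part");
    }
    fclose($out);
    rmdir($dir);
}

http_response_code(200);
echo 'OK';
```

Note that with this approach each request only has to fit a single chunk, so the PHP limits can stay close to their defaults; none of the huge php.ini values from the question are needed.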
– beerwin