
I want to download a large file with PHP, but because I am on shared hosting I sometimes hit a TIMEOUT error and the download is left incomplete. If the download stops for any reason, I want the script, when I call it again, to resume downloading from the point where it stopped and append to the existing partial file, so that the file is not corrupted once the download completes.

My simple code:

<?php
set_time_limit(0);
file_put_contents("Solidworks.Premium_2021_SP1_Windows.part1.rar", fopen("https://dl1.wikishare.ir/sdlftpuser02/Category99/Engineering/Solidworks/Solidworks.Premium_2021_SP1_Windows.part1.rar", 'r'));
?>

Any guidance and help will be greatly appreciated.

  • Use `curl` with the `Range:` header to specify where to start the download. – Barmar May 20 '22 at 15:32
  • I don't know where the problem is: the curl download speed is extremely low, but when I use file_put_contents the download speed is very high – Mehdi Karimi May 20 '22 at 15:43
  • You could also do it with `fopen()` using `stream_context_create()` to add the header. – Barmar May 20 '22 at 15:46
  • I'm not sure why there would be a speed difference. The bottleneck is the network, not the code that reads from it. – Barmar May 20 '22 at 15:46
  • See https://stackoverflow.com/questions/7967531/php-curl-read-remote-file-and-write-contents-to-local-file for how to have curl write directly to the file, rather than reading everything into memory first. That may be the problem with an enormous file like this. – Barmar May 20 '22 at 15:48
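Putting Barmar's first and last comments together, here is a minimal sketch of a resumable download with curl. It uses `CURLOPT_RESUME_FROM` (which makes curl send a `Range: bytes=offset-` header) and `CURLOPT_FILE` so the body is streamed straight to disk instead of being read into memory first. The function names are my own; this assumes the server honors `Range` requests (responds with 206 Partial Content):

```php
<?php
// Byte offset to resume from: the size of the partial file, or 0 if
// no partial download exists yet.
function resume_offset(string $dest): int
{
    return file_exists($dest) ? filesize($dest) : 0;
}

// Sketch: resume an interrupted download, appending to the partial file.
function download_resumable(string $url, string $dest): bool
{
    $offset = resume_offset($dest);

    // Open in append mode so newly received bytes join the partial file.
    $fp = fopen($dest, 'ab');
    if ($fp === false) {
        return false;
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);             // write body directly to disk
    curl_setopt($ch, CURLOPT_RESUME_FROM, $offset);  // sends "Range: bytes=$offset-"
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_FAILONERROR, true);     // treat HTTP >= 400 as failure

    $ok = curl_exec($ch) !== false;
    curl_close($ch);
    fclose($fp);
    return $ok;
}

// download_resumable(
//     "https://dl1.wikishare.ir/sdlftpuser02/Category99/Engineering/Solidworks/Solidworks.Premium_2021_SP1_Windows.part1.rar",
//     "Solidworks.Premium_2021_SP1_Windows.part1.rar"
// );
```

Note that for files larger than 2 GB you would need `CURLOPT_RESUME_FROM_LARGE` instead, and `set_time_limit(0)` is still worth keeping at the top of the script.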
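And a sketch of the `fopen()` variant Barmar mentions: `stream_context_create()` adds the `Range:` header, and `stream_copy_to_stream()` copies the body to the local file in chunks, so nothing large is held in memory. The `$url` and `$dest` values are placeholders:

```php
<?php
$url  = "https://example.com/large-file.rar"; // placeholder remote file
$dest = "large-file.rar";                     // local (possibly partial) file

// Resume from the size of the partial file, or start from byte 0.
$offset = file_exists($dest) ? filesize($dest) : 0;
$range  = "Range: bytes={$offset}-";

// Attach the Range header to the HTTP stream wrapper.
$ctx = stream_context_create([
    'http' => ['header' => $range . "\r\n"],
]);

$in  = fopen($url, 'rb', false, $ctx);
$out = fopen($dest, 'ab'); // append mode: new bytes join the partial file

if ($in !== false && $out !== false) {
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}
```

This is the same idea as the curl version; whether it is faster or slower comes down to the network, as noted above, not the copy loop itself.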

0 Answers