
I have a long list of URLs that generate files, and I want to download them in parallel using PHP with cURL. But not all of them at once: only, say, 10 at a time, so as not to slow down the server.

(I could of course start 10 downloads at once using curl_multi_exec, wait until they are all finished, then start the next 10, and so on. But if, say, 9 of the current 10 downloads have already finished, everything has to wait for the slow last one, and I am effectively downloading only 1 file in parallel until the next batch of 10 starts. So that would only work well if all downloads took about the same time, which is not the case for me.)
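To make concrete what I mean by a rolling window, here is a minimal, untested sketch: keep up to 10 transfers in a curl_multi pool and add the next URL from the queue as soon as any transfer finishes. The URL list and the way filenames are derived from URLs are just placeholders.

```php
<?php
$urls = [
    'https://example.com/file1',   // placeholder URLs
    'https://example.com/file2',
    // ...
];
$maxParallel = 10;

$mh     = curl_multi_init();
$queue  = $urls;
$active = 0;

function add_transfer($mh, $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    return $ch;
}

// Fill the pool with the first $maxParallel transfers.
while ($active < $maxParallel && $queue) {
    add_transfer($mh, array_shift($queue));
    $active++;
}

do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(1000); // avoid busy-looping if select fails
    }

    // Harvest each finished transfer and immediately start the next URL.
    while ($info = curl_multi_info_read($mh)) {
        $ch  = $info['handle'];
        $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);

        if ($info['result'] === CURLE_OK) {
            // Deriving the filename from the URL is a placeholder.
            file_put_contents(basename($url), curl_multi_getcontent($ch));
        } else {
            error_log("Download failed: $url");
        }

        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
        $active--;

        if ($queue) {
            add_transfer($mh, array_shift($queue));
            $active++;
        }
    }
} while ($running || $active || $queue);

curl_multi_close($mh);
```

The key difference from the batch approach is curl_multi_info_read: instead of waiting for the whole batch, each completed handle is harvested and immediately replaced, so the pool stays at 10 transfers as long as URLs remain.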

Max
  • So what is wrong in your code? Please edit your question and show us your PHP code so we can help you. – GNassro Jul 23 '20 at 12:48
  • I do not have any code; I have a question. Please clarify what you do not understand. – Max Jul 23 '20 at 13:11
  • Just try it yourself, then show us the code and the problem in it so we can help you. No one will help you if you have not tried yourself. – GNassro Jul 23 '20 at 13:18
  • @GNassro will do! – Max Jul 23 '20 at 13:59
  • You can use threads, but this requires you to recompile PHP: https://www.php.net/manual/es/class.thread.php. Also, if you are running your code in the terminal, you can split your URLs across different processes and run them in parallel. – David Rojo Jul 23 '20 at 15:20

0 Answers