
I'm working on a script that integrates online shops.

I have code like this (simplified):

// $asImageUrls - array of strings with image URLs

foreach ($asImageUrls as $sImageUrl)
{
    $imageContent = @file_get_contents($sImageUrl);
    // create filename, save image etc.
}

Connecting to the remote server and downloading each image takes a lot of time, which is a problem when I have around 500 products to import.

I was thinking about some kind of parallel downloading, but I don't know where to start.

What can I do, to make it faster?

Kamil
  • I would try to make it asynchronous, but I don't have enough knowledge in PHP to even know if that's possible... – Laurent S. May 22 '13 at 16:54
  • http://stackoverflow.com/questions/9308779/php-parallel-curl-requests should help – Clive May 22 '13 at 16:54
  • @Clive I didn't realize that cURL supports parallel downloads. Maybe you should add an answer. – Kamil May 22 '13 at 16:57
  • possibly using sftp? http://www.php.net/manual/en/wrappers.ssh2.php – Orangepill May 22 '13 at 16:59
  • `curl_multi_init` would work just fine ... – Baba May 22 '13 at 16:59
  • What about `curl_multi_init` compatibility with PHP 5.3? Will it blend? :D – Kamil May 22 '13 at 17:01
  • @Kamil I'm using it on 5.3 and 5.4 without any issues... – Clive May 22 '13 at 17:02
  • @Clive Thanks. You should add an answer; I have to accept something. – Kamil May 22 '13 at 17:04
  • @Kamil To be honest all I could do is reproduce the code from the other question. If you look at the body of the question it's pretty much identical, I think closing this one as a duplicate to point others to the (excellent) other answer would be best :) – Clive May 22 '13 at 17:05
  • @Kamil I would like to use `Threads`... I think I might have another nice solution – Baba May 22 '13 at 17:18
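
The `curl_multi_init` approach mentioned in the comments can be sketched as follows. This is a minimal sketch, assuming the curl extension is enabled; the `downloadAll()` helper name and the specific options are illustrative choices, not from the thread:

```php
<?php
// Download many URLs in parallel over one curl_multi handle.
function downloadAll(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers at once instead of one after another.
    $running = 0;
    do {
        curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh); // wait for socket activity, don't busy-loop
        }
    } while ($running > 0);

    // Collect the downloaded bodies, keyed like the input array.
    $results = array();
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

For 500 images you would probably feed this in batches (e.g. `array_chunk($asImageUrls, 10)`) rather than opening 500 connections at once.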

1 Answer


There are two main solutions to this problem:

1) Instead of downloading your images directly, store all the URLs in a file (and, if needed, the destination paths). Then use cron to call a script every n minutes that does the downloads for you. This is the best way to avoid server overload when a lot of people submit downloads at the same time.
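
A minimal sketch of that queue idea, assuming a plain text file as the queue; the file path, the `url;destination` line format, and the `enqueueImage()` helper are illustrative, not part of the answer:

```php
<?php
// During the import, only record what to download; a cron job
// (e.g. */5 * * * * php download-queue.php) processes the file later.
function enqueueImage($queueFile, $url, $destination)
{
    // One "url;destination" pair per line; LOCK_EX guards against
    // two imports appending at the same time.
    file_put_contents($queueFile, $url . ';' . $destination . "\n", FILE_APPEND | LOCK_EX);
}

// The cron script then reads the queue line by line, downloads
// each entry, and truncates the file when it is done.
```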

2) Use PHP's exec() function. This lets you run any system command, typically curl in your case. Add & at the end of the command to throw it in the background, and you can even store warnings and errors by redirecting them to a file.
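
A sketch of the exec() variant. The paths, the log file, and the `buildCurlCommand()` helper are examples; the trailing `&` is what detaches the command, and `2>&1` with the redirect is what captures errors in a file:

```php
<?php
// Build a shell command that downloads one URL in the background.
function buildCurlCommand($url, $destination, $logFile)
{
    return sprintf(
        'curl -s -o %s %s >> %s 2>&1 &', // & detaches; 2>&1 logs errors too
        escapeshellarg($destination),
        escapeshellarg($url),
        escapeshellarg($logFile)
    );
}

// Fire one background download per image; PHP does not wait for them.
$asImageUrls = array('http://example.com/a.jpg'); // example input
foreach ($asImageUrls as $i => $sImageUrl) {
    exec(buildCurlCommand($sImageUrl, '/tmp/img' . $i . '.jpg', '/tmp/curl.log'));
}
```

Note that backgrounded downloads give you no completion signal, so a follow-up check (e.g. verifying the files exist) is needed before using the images.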

ffarquet