
I have an HTTPS API URL which I need to call multiple times, as quickly as possible, to check the status of a domain.

How can I request the same URL multiple times concurrently, receive each response as output, close each connection as it finishes, and keep memory usage as low as possible?

I was told to use threading, but in another language (not PHP).

Any examples or a helping hand would be greatly appreciated! Thanks

iCeR
    If the intention is [domain snatching](http://www.quickonlinetips.com/archives/2005/05/how-to-snatch-expiring-domains-seek-the-professionals/), then using one of the established services is a better option. Your script implementation is hardly as important as the actual connection and latency. – mario Jan 02 '11 at 17:15
  • @mario: Cheers. I'm competing against 2 other companies for specific ccTLD's. They are new to the game and they are snapping up those domains in slow time (up to 10 seconds after purge time). I'm just a little slower at the moment. – iCeR Jan 02 '11 at 17:25
  • possible duplicate of [How to run cURL once, checking domain availability in a loop? Help fixing code please](http://stackoverflow.com/questions/4481946/how-to-run-curl-once-checking-domain-availability-in-a-loop-help-fixing-code-pl) – mario Jan 02 '11 at 17:36

2 Answers


Just did a quick Google search and came up with this; I believe it will solve your problem. This quote sums it up:

Using the curl_multi* family of cURL functions you can make those requests simultaneously. This way your app is as slow as the slowest request, as opposed to the sum of all requests.
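For completeness, here's a minimal sketch of what that might look like in practice. The URL and request count are placeholders, and error handling is omitted:

```php
<?php
// Fire $count requests at the same URL concurrently with curl_multi,
// print each response, and free every handle when done.
$url   = 'https://example.com/api/check'; // placeholder URL
$count = 10;                              // placeholder request count

$mh = curl_multi_init();
$handles = array();

for ($i = 0; $i < $count; $i++) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until none are still running.
$running = null;
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100); // avoid a busy loop if select() fails
    }
} while ($running > 0);

// Read each response, then detach and close the handle to free memory.
foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The total wall-clock time is roughly that of the slowest single request, rather than the sum of all of them.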

m.edmondson
  • Thanks! Will this fix the issue I have over here: http://stackoverflow.com/questions/4567492/how-to-reduce-virtual-memory-by-optimising-my-php-code, regarding excess memory? – iCeR Jan 02 '11 at 17:16
  • @iCeR - To be honest I have no idea, I'm mainly a .NET programmer - guess I was just lucky with google ;-) – m.edmondson Jan 02 '11 at 17:17

m.edmondson was faster than me :D

Some possibly helpful information: when doing a lot of requests with cURL simultaneously, you'll probably see a time lag because the cURL DNS resolver doesn't work concurrently.

As an alternative idea, you could write a PHP CLI script which forks children that work separately, or just use bash & lynx (on Linux).
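A rough sketch of the forking variant, assuming the pcntl extension is available and the script is run from the CLI (the URL and child count are placeholders):

```php
<?php
// Fork one child per request; each child fetches the URL on its own
// and exits, so the requests run in parallel processes.
$url      = 'https://example.com/api/check'; // placeholder URL
$children = 5;                               // placeholder child count

for ($i = 0; $i < $children; $i++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    }
    if ($pid === 0) {
        // Child: perform a single request, print it, and exit.
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        echo curl_exec($ch), "\n";
        curl_close($ch);
        exit(0);
    }
}

// Parent: reap every child so none are left as zombies.
while (pcntl_waitpid(-1, $status) > 0) {
    // keep waiting until all children have exited
}
```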

Samuel Herzog
  • Thank you. Do you have an example of running a URL multiple times using any of those methods? – iCeR Jan 02 '11 at 17:23
  • I built a server observation tool once and used exactly the same blog post as a base that m.edmondson provided. – Samuel Herzog Jan 02 '11 at 17:27
  • So there is not really anything valuable to add. Just one thing to mention: I didn't see a `curl_multi_remove_handle()` in your script; maybe that would reduce the memory a little bit (see the sketch after these comments). – Samuel Herzog Jan 02 '11 at 17:27
  • Even adding it doesn't help. With curl_multi it still exceeds the 100 MB memory limit when loading more than 90 URLs. – iCeR Jan 02 '11 at 17:43
  • I don't have any experience optimizing a curl script yet, sadly. – Samuel Herzog Jan 02 '11 at 17:50
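For what it's worth, one pattern that might help with the memory ceiling discussed above is to drain transfers the moment they finish with `curl_multi_info_read()`, instead of keeping all responses around until the end. A sketch of the run loop only, assuming `$mh` already has its handles added as in the earlier example:

```php
// Process each transfer as soon as it completes, then release its
// handle, so only in-flight responses are ever held in memory.
$running = null;
do {
    curl_multi_exec($mh, $running);

    // Pull off any transfers that have just finished.
    while ($info = curl_multi_info_read($mh)) {
        $done = $info['handle'];
        echo curl_multi_getcontent($done), "\n"; // emit the response now
        curl_multi_remove_handle($mh, $done);    // then free the handle
        curl_close($done);
    }

    if (curl_multi_select($mh) === -1) {
        usleep(100); // avoid a busy loop if select() fails
    }
} while ($running > 0);
curl_multi_close($mh);
```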