
Right now I am using cURL with something like this:

foreach ($urls as $url) {

    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_exec($ch);
    curl_close($ch);

}

to trigger a PHP script on a remote server.

The thing is, I just want to trigger each script; I don't care at all what it returns, and I want to move on to the next address in my loop.

So, how can I eliminate waiting for the response and just trigger the scripts on the servers? (I have about 200 URLs in my array that I need to loop through, triggering each one.)

So basically, I just want to trigger each script, move on to the next one, and not care what it returns.

Another concern of mine is whether I can move the curl_init() outside of the loop, like this:

$ch = curl_init();

foreach ($urls as $url) {

    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_exec($ch);

}

curl_close($ch);

If there is a faster way to achieve this without using cURL, please let me know. I just need to trigger 100 scripts on remote servers from one loop inside one file.

John Doerthy

3 Answers

<?php
$fp = fsockopen("mesela.dom", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: mesela.dom\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>
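A sketch of this fsockopen approach applied to the asker's loop. The host and path are derived from each URL with parse_url(); plain http on port 80 is assumed, and the URL list here is a placeholder:

```php
<?php
// Hypothetical URL list; replace with your own array.
$urls = array('http://example.com/job1.php', 'http://example.com/job2.php');

foreach ($urls as $url) {
    $parts = parse_url($url);
    $host  = $parts['host'];
    $path  = isset($parts['path']) ? $parts['path'] : '/';
    if (isset($parts['query'])) {
        $path .= '?' . $parts['query'];
    }

    // Short connect timeout so one dead host doesn't stall the loop.
    $fp = fsockopen($host, 80, $errno, $errstr, 5);
    if (!$fp) {
        continue; // skip unreachable hosts instead of aborting
    }

    $out  = "GET $path HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // close immediately: the response is never read
}
?>
```

Because the socket is closed right after the request is written, the loop never waits for any response body, which is what makes this "fire and forget".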
Murat Cem YALIN
  • Thanks, can you edit this code to use my loop, e.g. `foreach ($urls as $url) { }`? Is it faster than cURL? Will I be able to reach 100 URLs in 10 seconds using this function? Do I need the if/else condition at all? Can it be trimmed somehow so it executes faster? – John Doerthy Dec 09 '15 at 09:31
  • Technically yes, but in practice... you should try. It also depends on internet speed and your server's specs. – Murat Cem YALIN Dec 09 '15 at 09:33
  • Also, you can use multithreading to speed things up. Here is the link: http://php.net/manual/en/intro.pthreads.php – Murat Cem YALIN Dec 09 '15 at 09:35
  • How much faster is it than cURL? Why do you think it's faster? I cannot find any info saying it's faster than cURL. Please provide some info. – John Doerthy Dec 09 '15 at 09:38
  • I can't see any example on that page of how to establish a multithreaded connection. It's just some mumbo jumbo without any example. – John Doerthy Dec 09 '15 at 09:40
  • I am replying from mobile, so I cannot give you much in the way of written and tested code, but you can try it on your own; it is like 10 lines of code. Code it and test it. – Murat Cem YALIN Dec 09 '15 at 09:40
  • Don't get me wrong, I appreciate your help, but I am not an advanced PHP user, so you need to talk with examples ;). Thanks anyway, my friend. – John Doerthy Dec 09 '15 at 09:41
  • You are talking like I have to prove something to you, but sorry, no: I show you the way and you should do the rest. – Murat Cem YALIN Dec 09 '15 at 09:42

You can use a queue system: your code adds these URLs as jobs, and multiple workers perform the cURL calls.

This makes your code asynchronous (it does not wait for a response from the cURL call).

A good PHP library you can use: https://github.com/chrisboulton/php-resque
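A minimal sketch of how this could look with php-resque. The queue name, the `CurlJob` class, and the Redis address are assumptions for illustration, not part of the original answer:

```php
<?php
// Load php-resque (the path depends on how you installed the library).
require 'vendor/autoload.php';

// Point php-resque at the Redis instance backing the queue (assumed address).
Resque::setBackend('localhost:6379');

// The job class that a worker process will execute; the name is hypothetical.
class CurlJob
{
    public function perform()
    {
        // php-resque exposes the enqueued arguments as $this->args.
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $this->args['url']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);
        curl_close($ch);
    }
}

// The producer just enqueues one job per URL and returns immediately.
$urls = array('http://example.com/a.php', 'http://example.com/b.php');
foreach ($urls as $url) {
    Resque::enqueue('curl_calls', 'CurlJob', array('url' => $url));
}
```

The workers are started as separate processes (php-resque ships a runner script for this) and drain the queue independently, so the producing loop never blocks on the HTTP requests.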

  • I am talking about 24h nonstop imports of actual information. Queues are not an option for me. It has to be real time, sorry ;( Btw, what is Redis? – John Doerthy Dec 09 '15 at 09:35
  • Redis is a queue handler. You can think of it as a database (very simply explained). – DasSaffe Dec 09 '15 at 09:46
  • Hi, yes, you can have 24-hour nonstop import of information using this system. Your PHP file keeps adding jobs to the queue, and your workers keep doing the cURL calls, adding content to the DB, etc. When there are no jobs, the workers go to sleep and wake up when there is work to be done. – Varun Bhatia Dec 09 '15 at 10:16
  • It's on hundreds of remote websites, not on one server. There is no chance of installing programs on the Linux server to manage queues etc. But thanks anyway. Asynchronous multi cURL seems to be working fine. – John Doerthy Dec 09 '15 at 13:57

You can use curl multi:

<?php
    function setcurloptions( $handle=false, $url=false, $cacert=false ){
        if( $handle && $url ){
            if( parse_url( $url, PHP_URL_SCHEME )=='https' ){
                // verify the peer against the supplied CA bundle
                curl_setopt( $handle, CURLOPT_SSL_VERIFYPEER, true );
                curl_setopt( $handle, CURLOPT_SSL_VERIFYHOST, 2 );
                curl_setopt( $handle, CURLOPT_CAINFO, realpath( $cacert ) );
            }
            curl_setopt( $handle, CURLOPT_URL, $url );
            curl_setopt( $handle, CURLOPT_HEADER, false );
            curl_setopt( $handle, CURLOPT_FRESH_CONNECT, true );
            curl_setopt( $handle, CURLOPT_FORBID_REUSE, true );
            curl_setopt( $handle, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1 );
            // CURLOPT_CLOSEPOLICY was never implemented and was removed in PHP 5.6, so it is omitted here
            curl_setopt( $handle, CURLOPT_BINARYTRANSFER, true );
            curl_setopt( $handle, CURLOPT_AUTOREFERER, true );
            curl_setopt( $handle, CURLOPT_CONNECTTIMEOUT, 30 );
            curl_setopt( $handle, CURLOPT_RETURNTRANSFER, true );
            // HTTP_USER_AGENT is unset when running from the CLI, so fall back to a fixed string
            curl_setopt( $handle, CURLOPT_USERAGENT, isset( $_SERVER['HTTP_USER_AGENT'] ) ? $_SERVER['HTTP_USER_AGENT'] : 'php-curl-multi' );
        }
    }


    $cacert='c:/wwwroot/cacert.pem';
    $urls=array(
        'http://www.example.com',
        'http://www.example.com',
        'http://www.example.com'
    );

    $multi = curl_multi_init();
    $handles = array();


    foreach( $urls as $i => $url ){
        $handle = curl_init();
        setcurloptions( $handle, $url, $cacert );
        curl_multi_add_handle( $multi, $handle );
        $handles[]=$handle;
    }


    $active=null;
    do {
        $mrc = curl_multi_exec( $multi, $active );
        usleep(100);
    } while( $mrc == CURLM_CALL_MULTI_PERFORM );


    while( $active && $mrc == CURLM_OK ) {
        if( curl_multi_select( $multi ) != -1 ) {
            do {
                $mrc = curl_multi_exec( $multi, $active );
            } while( $mrc == CURLM_CALL_MULTI_PERFORM );
        }
        usleep( 100 );
    }


    foreach( $handles as $i => $handle ){
        /*
        if you want to do something at all with response 
        $response=curl_multi_getcontent( $handle );
        */
        curl_multi_remove_handle( $multi, $handle );
        curl_close( $handle );
        usleep(100);
    }



    curl_multi_close( $multi );
?>
Professor Abronsius
  • Wow, thanks ;) Looks great, I will try it now. Btw, how much faster is it than a standard PHP loop, like I have used so far? – John Doerthy Dec 09 '15 at 09:46
  • Could I ask what `$cacert` is? Do I need it, or is it something specific to your script that I can ignore? – John Doerthy Dec 09 '15 at 09:47
  • `$cacert` is the widely available certificate bundle - if any of the URLs use `https`, then cURL will often falter if the correct settings are not present in the request. Search Google and you will find it! AFAIK, curl multi runs the requests in parallel - so they should all be doing their own thing at the same time, thus hopefully saving time. – Professor Abronsius Dec 09 '15 at 09:50
  • OK, so I can ignore the certificate thing then? I am not using https for these URLs. And why are you using usleep(100)? Btw, are all this code and these settings really necessary? When I look here: http://arguments.callee.info/2010/02/21/multiple-curl-requests-with-php/ it's much smaller. I need a small script that's as fast as possible. What is the foreach loop over $handles for? Do I need it if I just want to trigger the URLs and don't care what they return? Thank you. – John Doerthy Dec 09 '15 at 10:07
  • Could you answer my questions, please? And maybe update your code? It seems like a lot of the stuff is specific to your web page. Thank you in advance. – John Doerthy Dec 09 '15 at 13:56
  • Believe it or not, I don't spend all day at the laptop. To answer your questions: ignore the certificate parts if they're not relevant; usleep can be reduced or removed if you feel it is going to impact your app (very small numbers, btw); yes, I dare say the cURL options can be reduced - this worked for me in a situation where I needed it. The foreach loop over the handles is there to clean up - leave it in if you want that. – Professor Abronsius Dec 09 '15 at 14:00
  • Thank you. But why do you need usleep(100) in your loops? Why did you decide to use usleep; what was the problem that made you add it to your code? Could you comment on that a little further? Thank you in advance. – John Doerthy Dec 09 '15 at 14:06
  • I needed to process the responses - unlike you. More than that I can't remember, other than that usleep added a fraction of a delay which seemed to work, so I left it. – Professor Abronsius Dec 09 '15 at 14:19