103

I wish to make a simple GET request to another script on a different server. How do I do this?

In one case, I just need to request an external script without the need for any output.

make_request('http://www.externalsite.com/script1.php?variable=45'); //example usage

In the second case, I need to get the text output.

$output = make_request('http://www.externalsite.com/script2.php?variable=45');
echo $output; //string output

To be honest, I do not want to mess around with cURL, as this isn't really the job of cURL. I also do not want to make use of http_get, as I do not have the PECL extensions.

Would fsockopen work? If so, how do I do this without reading in the contents of the file? Is there no other way?

Thanks all

Update

I should have added: in the first case, I do not want to wait for the script to return anything. As I understand it, file_get_contents() will wait for the page to load fully, etc.?

Abs

22 Answers

49

file_get_contents will do what you want

$output = file_get_contents('http://www.example.com/');
echo $output;

Edit: One way to fire off a GET request and return immediately.

Quoted from http://petewarden.typepad.com/searchbrowser/2008/06/how-to-post-an.html

function curl_post_async($url, $params)
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    $out = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}

What this does is open a socket, fire off a POST request, and immediately close the socket and return.

Marquis Wang
  • curl_post_async sends a POST request, not a GET. – Vinko Vrsalovic Jun 08 '09 at 02:36
  • Yeah, why would you want a POST request? In fact, I think a HEAD request would make the most sense. – Sasha Chedygov Jun 08 '09 at 02:47
  • Er, yeah, good point. Make that curl_get_async and replace POST with GET. I also think that a HEAD request would make more sense, but this would be slightly faster? I think? – Marquis Wang Jun 08 '09 at 05:59
  • Changing the POST to GET alone didn't solve the problem. So I kept it as a POST and I finally have an immediate return of my script! :) – Abs Jun 08 '09 at 19:25
  • @Abs: My answer below shows how to use GET instead of POST to achieve the same results. – catgofire Oct 15 '10 at 19:33
  • This was a great start; I was able to clean it up and make it even faster and more flexible! https://gist.github.com/1898009 – Jay Taylor Feb 24 '12 at 05:28
  • Am I right in saying that this function is improperly named? It really doesn't have anything to do with the curl library. fsock_post_async() would be more like it. – MikeMurko Oct 08 '12 at 19:28
  • This is NOT async! In particular, if the server on the other side is down, this piece of code will hang for 30 seconds (the fifth parameter of fsockopen). Also, the fwrite is going to take its sweet time to execute (you can limit that with stream_set_timeout($fp, $my_timeout)). The best you can do is set a low timeout on fsockopen, e.g. 0.1 (100 ms), and $my_timeout to 100 ms. You risk, though, that the request times out. – Chris Cinelli Oct 25 '12 at 00:53
  • One thing to note: if the URL requested is rewritten (i.e. the server uses mod_rewrite), this approach will not work. You'll need to use something like curl instead. – blak3r Jan 06 '13 at 00:23
  • This has nothing to do with async. This is as sync as it gets... Async means doing other tasks while this task is being completed. It's parallel execution. – CodeAngry Oct 28 '13 at 14:40
  • This is neither async nor is it using curl; how dare you call it `curl_post_async` and even get upvotes... – Daniel W. Oct 31 '13 at 11:20
  • Real async could be achieved via multithreading: http://stackoverflow.com/questions/70855/how-can-one-use-multi-threading-in-php-applications – zelibobla Apr 04 '14 at 15:44
  • `exec("curl $url > /dev/null 2>&1 &");` is one of the fastest solutions around (thanks @Matt Huggins). It's immensely faster (1.9 s for 100 iterations) than the `curl_post_async()` function (14.8 s), and it doesn't come with the same timeout/URL-rewriting/etc. limitations, since it's full-blown cURL. AND it's a one-liner... – rinogo Jan 28 '16 at 20:33
  • This answer is really outdated and doesn't support HTTPS. – Simon East Jul 09 '18 at 06:07
  • In which world is file_get_contents async? This answer needs to be deleted. – Boy Sep 25 '20 at 09:42
  • This is not async. You can use Swoole, Ratchet or spatie/async. – Mohammad Salehi May 24 '21 at 11:37
32

This is how to make Marquis' answer work with both POST and GET requests:

  // $type must equal 'GET' or 'POST'
  function curl_request_async($url, $params, $type='POST')
  {
      $post_params = array();
      foreach ($params as $key => &$val) {
          if (is_array($val)) $val = implode(',', $val);
          $post_params[] = $key.'='.urlencode($val);
      }
      $post_string = implode('&', $post_params);

      $parts = parse_url($url);

      $fp = fsockopen($parts['host'],
          isset($parts['port']) ? $parts['port'] : 80,
          $errno, $errstr, 30);

      // Data goes in the path for a GET request
      if ('GET' == $type) $parts['path'] .= '?'.$post_string;

      $out = "$type ".$parts['path']." HTTP/1.1\r\n";
      $out .= "Host: ".$parts['host']."\r\n";
      $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
      $out .= "Content-Length: ".strlen($post_string)."\r\n";
      $out .= "Connection: Close\r\n\r\n";
      // Data goes in the request body for a POST request
      if ('POST' == $type) $out .= $post_string;

      fwrite($fp, $out);
      fclose($fp);
  }
catgofire
  • This is a handy code snippet, and I've been using it here and there, but I now find that I need to do the same thing with an SSL site. Is there anything I need to change besides the HTTP/1.1 type and the port? – Kevin Jhangiani Apr 12 '11 at 21:59
  • Can you please be more specific about how to call this function? – pufos Feb 20 '12 at 15:34
  • "Is there anything I need to change besides the HTTP/1.1 type and the port?" - Yes, you should call fsockopen() with the hostname as `ssl://hostname` instead of just `hostname`. – Cowlby Aug 05 '12 at 06:14
  • This is NOT async! In particular, if the server on the other side is down, this piece of code will hang for 30 seconds (the fifth parameter of fsockopen). Also, the fwrite is going to take its sweet time to execute (you can limit that with stream_set_timeout($fp, $my_timeout)). The best you can do is set a low timeout on fsockopen, e.g. 0.1 (100 ms), and $my_timeout to 100 ms. You risk, though, that the request times out. – Chris Cinelli Oct 25 '12 at 00:53
  • In response to the question about using this for SSL: you can make it SSL by changing the port to 443 and prepending ssl:// to the hostname in fsockopen: $fp = fsockopen("ssl://".$parts['host'], ... – Michael Dogger Jul 06 '11 at 03:55
  • I tried this, but it only works when I step through the code in the PhpStorm debugger :/ Is it possible that the original request to my server gets killed before it could POST to another server? – davidhq Aug 27 '14 at 17:06
  • Content-Length should not be set for GET. Maybe in some scenarios it doesn't cause an error, but in my case it resulted in the request not being processed by the called PHP script. – user3285954 Jun 20 '15 at 22:35
15

Regarding your update, about not wanting to wait for the full page to load - I think an HTTP HEAD request is what you're looking for.

get_headers should do this - I think it only requests the headers, so you will not be sent the full page content.
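For example, a minimal sketch (note that by default get_headers() actually issues a GET request; forcing a true HEAD request requires overriding the default stream context, as shown here - the URL is illustrative):

```php
<?php
// Make PHP's HTTP stream wrapper send HEAD instead of GET,
// then fetch only the response headers.
stream_context_set_default(array('http' => array('method' => 'HEAD')));

$headers = get_headers('http://www.example.com/');
print_r($headers); // status line plus one array entry per header
```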

"PHP / Curl: HEAD Request takes a long time on some sites" describes how to do a HEAD request using PHP/Curl

If you want to trigger the request, and not hold up the script at all, there are a few ways, of varying complexities..

  • Execute the HTTP request as a background process (see "php execute a background process") - basically you would execute something like `wget -O /dev/null $carefully_escaped_url`. This will be platform specific, and you have to be really careful about escaping parameters to the command
  • Executing a PHP script in the background - basically the same as the UNIX process method, but executing a PHP script rather than a shell command
  • Have a "job queue", using a database (or something like beanstalkd which is likely overkill). You add a URL to the queue, and a background process or cron-job routinely checks for new jobs and performs requests on the URL
dbr
  • +1 for various interesting options that I've not thought of before – Jasdeep Khalsa Dec 08 '12 at 08:09
  • "I think it only requests the headers" - Perhaps, but there is nothing to stop a document from sending a full response body in response to a HEAD request. And i assume this method would use fsock under the hood and force it to wait for (and read) the full response. – hiburn8 Oct 18 '19 at 14:32
6

I would recommend the well-tested PHP library curl-easy:

<?php
$request = new cURL\Request('http://www.externalsite.com/script2.php?variable=45');
$request->getOptions()
    ->set(CURLOPT_TIMEOUT, 5)
    ->set(CURLOPT_RETURNTRANSFER, true);

// add callback when the request will be completed
$request->addListener('complete', function (cURL\Event $event) {
    $response = $event->response;
    $content = $response->getContent();
    echo $content;
});

while ($request->socketPerform()) {
    // do anything else when the request is processed
}
stil
  • The [Guzzle PHP library](http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests) also has support for doing concurrent and asynchronous requests. – Simon East Jul 09 '18 at 06:09
  • Guzzle claims that it has support, but testing its postAsync method, it looks like it does 150 ms synchronously and then 2 ms asynchronously. I've spent more than an hour trying to fix it without success - wouldn't recommend it. – Velizar Hristov Nov 23 '18 at 12:39
6

You don't. While PHP offers lots of ways to call a URL, it doesn't offer out-of-the-box support for doing any kind of asynchronous/threaded processing per request/execution cycle. Any method of sending a request for a URL (or an SQL statement, etc.) is going to wait for some kind of response. You'll need some kind of secondary system running on the local machine to achieve this (google around for "php job queue").

Alana Storm
  • There is a hack here: http://stackoverflow.com/questions/124462/asynchronous-php-calls (answer by Christian Davén), but I agree that a queue would be the right way to do it. – Chris Cinelli Oct 25 '12 at 00:57
  • I think this answer from 2009 is now outdated. The [Guzzle PHP library](http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests) now has support for doing concurrent and asynchronous requests. – Simon East Jul 09 '18 at 06:08
5
function make_request($url, $waitResult = true) {
    $cmi = curl_multi_init();

    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

    curl_multi_add_handle($cmi, $curl);

    $running = null;
    do {
        curl_multi_exec($cmi, $running);
        usleep(100000); // sleep() only takes whole seconds; wait 100 ms instead
        if (!$waitResult) {
            break;
        }
    } while ($running > 0);

    curl_multi_remove_handle($cmi, $curl);
    $result = null;
    if ($waitResult) {
        $curlInfos = curl_getinfo($curl);
        if ((int) $curlInfos['http_code'] == 200) {
            $result = curl_multi_getcontent($curl);
        }
    }
    curl_multi_close($cmi);
    return $result;
}
amez
  • You could make it return an object which lets you call `getstatus()` or `waitSend()` or `waitResult()`. That way, the caller can get fully async behavior by calling in a loop to check if there are results and, if not, continue with whatever other task is running. Hmm, now I want to port `Task` from .NET to PHP… – binki Oct 14 '16 at 16:18
4

If you are using a Linux environment, you can use PHP's exec command to invoke the Linux curl binary. Here is sample code that makes an asynchronous HTTP POST:

function _async_http_post($url, $json_string) {
  // escapeshellarg() keeps the JSON body and URL from breaking the shell command
  $run = "curl -X POST -H 'Content-Type: application/json'";
  $run .= " -d " . escapeshellarg($json_string) . " " . escapeshellarg($url);
  $run .= " > /dev/null 2>&1 &"; // discard output and detach
  exec($run, $output, $exit);
  return $exit == 0;
}

This code does not need any extra PHP libraries, and it can complete the HTTP POST in less than 10 milliseconds.

Stranger
  • This is a very bad idea: exec fails a lot. Imagine that 6/200 clients will not get their email confirmation for a paid booking... – HellBaby Jul 17 '15 at 17:42
  • This worked for me, as I just needed a ping to start another script on another server. I just used it like this: _async_http_post($url, ''); And this works on OVH shared servers... which is great. – Kilowog Jun 03 '19 at 12:52
3

Nobody seems to mention Guzzle, which is a PHP HTTP client that makes it easy to send HTTP requests. It can work with or without Curl. It can send both synchronous and asynchronous requests.

use GuzzleHttp\Exception\RequestException;
use Psr\Http\Message\ResponseInterface;

$client = new GuzzleHttp\Client();
$promise = $client->requestAsync('GET', 'http://httpbin.org/get');
$promise->then(
    function (ResponseInterface $res) {
        echo $res->getStatusCode() . "\n";
    },
    function (RequestException $e) {
        echo $e->getMessage() . "\n";
        echo $e->getRequest()->getMethod();
    }
);
zstate
  • Yes, many of the answers in this thread are quite old, but Guzzle is definitely the best option I've come across in 2018, thanks for posting. – Simon East Jul 09 '18 at 06:13
3

Interesting problem. I'm guessing you just want to trigger some process or action on the other server, but don't care what the results are and want your script to continue. There is probably something in cURL that can make this happen, but you may want to consider using exec() to run another script on the server that does the call if cURL can't do it. (Typically people want the results of the script call, so I'm not sure if PHP has the ability to just trigger the process.) With exec() you could run wget, or even another PHP script that makes the request with file_get_contents().
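A sketch of that exec() idea (the worker script name is hypothetical - it would just call file_get_contents() on the URL passed in $argv[1]):

```php
<?php
// Launch a separate PHP process to perform the request; the parent
// script continues immediately. Output is discarded and the trailing
// & detaches the child from this request.
$url = 'http://www.externalsite.com/script1.php?variable=45';
exec('php worker.php ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');
```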

Darryl Hein
2

For me, the question about asynchronous GET requests came up because I hit a situation where I needed to make hundreds of requests, fetch and process the result data for each one, and every request took significant milliseconds to execute, which led to minutes(!) of total execution time with a simple file_get_contents.

In this case, the comment by w_haigh at php.net on curl_multi_init was very helpful: http://php.net/manual/en/function.curl-multi-init.php

So, here is my upgraded and cleaned-up version for making lots of requests simultaneously. For my case it's equivalent to an "asynchronous" way. Maybe it helps someone!

// Build the multi-curl handle; each $ch is added to it below
$mh = curl_multi_init();

// Build the individual requests, but do not execute them
$chs = [];
$chs['ID0001'] = curl_init('http://webservice.example.com/?method=say&word=Hello');
$chs['ID0002'] = curl_init('http://webservice.example.com/?method=say&word=World');
// $chs[] = ...
foreach ($chs as $ch) {
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,  // Return requested content as string
        CURLOPT_HEADER => false,         // Don't save returned headers to result
        CURLOPT_CONNECTTIMEOUT => 10,    // Max seconds wait for connect
        CURLOPT_TIMEOUT => 20,           // Max seconds on all of request
        CURLOPT_USERAGENT => 'Robot YetAnotherRobo 1.0',
    ]);

    // Well, with a little more of code you can use POST queries too
    // Also, useful options above can be  CURLOPT_SSL_VERIFYHOST => 0  
    // and  CURLOPT_SSL_VERIFYPEER => false ...

    // Add every $ch to the multi-curl handle
    curl_multi_add_handle($mh, $ch);
}

// Execute all queries simultaneously, and continue when ALL OF THEM are complete
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

// Close the handles
foreach ($chs as $ch) {
    curl_multi_remove_handle($mh, $ch);
}
curl_multi_close($mh);

// All of our requests are done; we can now access the results.
// With the help of the ids we can tell which response belongs
// to which concrete request.
$responses = [];
foreach ($chs as $id => $ch) {
    $responses[$id] = curl_multi_getcontent($ch);
    curl_close($ch);
}
unset($chs); // Finita, no more need any curls :-)

print_r($responses); // output results

It's easy to rewrite this to handle POST or other types of HTTP(S) requests, or any combination of them. Cookie support, redirects, HTTP auth, etc. can be added as well.
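For instance, a sketch of turning one of the handles above into a POST request (set before it is added to the multi-handle; the field values here are just the ones from the example URLs):

```php
<?php
// Convert an easy handle to POST; the fields are sent urlencoded in the body.
curl_setopt_array($chs['ID0001'], [
    CURLOPT_POST       => true,
    CURLOPT_POSTFIELDS => http_build_query(['method' => 'say', 'word' => 'Hello']),
]);
```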

FlameStorm
  • Ohh.. I see the question was created in 2009, and I wrote my answer in 2016 :) But a lot of us google *php get asynchronous* and end up here. – FlameStorm Apr 21 '16 at 22:03
  • Yes, I also came here when googling. Some coders might also want to look into the [Guzzle PHP library](http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests), which has support for doing concurrent and asynchronous requests. – Simon East Jul 09 '18 at 06:10
2

You'd better consider using message queues instead of the methods advised above. I'm sure this will be the better solution, although it requires a little more work than just sending a request.
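As a minimal sketch of the idea, assuming beanstalkd with the pda/pheanstalk client (v4) - the tube name and worker are illustrative:

```php
<?php
// Producer: enqueue the URL instead of requesting it inline.
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

$pheanstalk = Pheanstalk::create('127.0.0.1');
$pheanstalk->useTube('http-requests')
           ->put('http://www.externalsite.com/script1.php?variable=45');

// A separate worker process watches the tube and performs the requests:
// $job = $pheanstalk->watch('http-requests')->reserve();
// file_get_contents($job->getData());
// $pheanstalk->delete($job);
```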

mra214
2

Let me show you my way :)

It needs Node.js installed on the server.

(My server can send 1000 HTTPS GET requests in only 2 seconds.)

url.php:

<?
$urls = array_fill(0, 100, 'http://google.com/blank.html');

function execinbackground($cmd) { 
    if (substr(php_uname(), 0, 7) == "Windows"){ 
        pclose(popen("start /B ". $cmd, "r"));  
    } 
    else { 
        exec($cmd . " > /dev/null &");   
    } 
} 
fwrite(fopen("urls.txt", "w"), implode("\n", $urls));
execinbackground("nodejs urlscript.js urls.txt");
// { do your work while get requests being executed.. }
?>

urlscript.js:

var https = require('https');
var url = require('url');
var http = require('http');
var fs = require('fs');
var dosya = process.argv[2];
var logdosya = 'log.txt';
var count=0;
http.globalAgent.maxSockets = 300;
https.globalAgent.maxSockets = 300;

setTimeout(timeout,100000); // maximum execution time (in ms)

function trim(string) {
    return string.replace(/^\s*|\s*$/g, '')
}

fs.readFile(process.argv[2], 'utf8', function (err, data) {
    if (err) {
        throw err;
    }
    parcala(data);
});

function parcala(data) {
    var data = data.split("\n");
    count=''+data.length+'-'+data[1];
    data.forEach(function (d) {
        req(trim(d));
    });
    /*
    fs.unlink(dosya, function d() {
        console.log('<%s> file deleted', dosya);
    });
    */
}


function req(link) {
    var linkinfo = url.parse(link);
    if (linkinfo.protocol == 'https:') {
        var options = {
            host: linkinfo.host,
            port: 443,
            path: linkinfo.path,
            method: 'GET'
        };
        https.get(options, function (res) {
            res.on('data', function (d) {});
        }).on('error', function (e) { console.error(e); });
    } else {
        var options = {
            host: linkinfo.host,
            port: 80,
            path: linkinfo.path,
            method: 'GET'
        };
        http.get(options, function (res) {
            res.on('data', function (d) {});
        }).on('error', function (e) { console.error(e); });
    }
}


process.on('exit', onExit);

function onExit() {
    log();
}

function timeout()
{
console.log("i am too far gone");process.exit();
}

function log() 
{
    var fd = fs.openSync(logdosya, 'a+');
    fs.writeSync(fd, dosya + '-'+count+'\n');
    fs.closeSync(fd);
}
user1031143
1

Here's an adaptation of the accepted answer for performing a simple GET request.

One thing to note: if the server does any URL rewriting, this will not work. You'll need to use a more full-featured HTTP client.

  /**
   * Performs an async GET request (doesn't wait for the response).
   * Note: one limitation of this approach is it will not work if the server does any URL rewriting.
   */
  function async_get($url)
  {
      $parts = parse_url($url);
      // parse_url() splits the query string out of the path, so re-append it
      $path = (isset($parts['path']) ? $parts['path'] : '/')
            . (isset($parts['query']) ? '?'.$parts['query'] : '');

      $fp = fsockopen($parts['host'],
          isset($parts['port']) ? $parts['port'] : 80,
          $errno, $errstr, 30);

      $out = "GET ".$path." HTTP/1.1\r\n";
      $out .= "Host: ".$parts['host']."\r\n";
      $out .= "Connection: Close\r\n\r\n";
      fwrite($fp, $out);
      fclose($fp);
  }
blak3r
1

Just a few corrections to the scripts posted above. The following works for me:

function curl_request_async($url, $params, $type='GET')
{
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    // For HTTPS, fsockopen() needs the ssl:// transport, not just port 443
    $is_https = isset($parts['scheme']) && $parts['scheme'] == 'https';
    $fp = fsockopen(($is_https ? 'ssl://' : '').$parts['host'],
        $is_https ? 443 : 80,
        $errno, $errstr, 30);

    $out = "$type ".$parts['path'] . (isset($parts['query']) ? '?'.$parts['query'] : '') ." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($post_string)."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    // Data goes in the request body for a POST request
    if ('POST' == $type) $out .= $post_string;
    fwrite($fp, $out);
    fclose($fp);
}
A23
  • I'm having a problem where fwrite returns a positive number of bytes, but the script endpoint is not called (nothing is logged). It only works when I use: while (!feof($fp)) { fgets($fp, 128); } – Miguel Dec 17 '15 at 11:11
1

Try:

// Your code here
$pid = pcntl_fork();
if ($pid == -1) {
    die('could not fork');
} else if ($pid) {
    // Parent: respond immediately
    echo "Bye";
} else {
    // Child: do the post-processing (e.g. the HTTP request)
}

This will NOT work as an Apache module; you need to be using CGI.

LM.
1

I found this interesting link for doing asynchronous processing (GET requests):

askapache

Furthermore, you could do asynchronous processing by using a message queue, for instance beanstalkd.

Alfred
0

Based on this thread, I made this for my CodeIgniter project. It works just fine. You can have any function processed in the background.

A controller that accepts the async calls.

class Daemon extends CI_Controller
{
    // Remember to disable CI's csrf-checks for this controller

    function index( )
    {
        ignore_user_abort( 1 );
        try
        {
            if ( strcmp( $_SERVER['REMOTE_ADDR'], $_SERVER['SERVER_ADDR'] ) != 0 && !in_array( $_SERVER['REMOTE_ADDR'], $this->config->item( 'proxy_ips' ) ) )
            {
                log_message( "error", "Daemon called from untrusted IP-address: " . $_SERVER['REMOTE_ADDR'] );
                show_404( '/daemon' );
                return;
            }

            $this->load->library( 'encrypt' );
            $params = unserialize( urldecode( $this->encrypt->decode( $_POST['data'] ) ) );
            unset( $_POST );
            $model = array_shift( $params );
            $method = array_shift( $params );
            $this->load->model( $model );
            if ( call_user_func_array( array( $this->$model, $method ), $params ) === FALSE )
            {
                log_message( "error", "Daemon could not call: " . $model . "::" . $method . "()" );
            }
        }
        catch(Exception $e)
        {
            log_message( "error", "Daemon has error: " . $e->getMessage( ) . $e->getFile( ) . $e->getLine( ) );
        }
    }
}

And a library that does the async calls

class Daemon
{
    public function execute_background( /* model, method, params */ )
    {
        $ci = &get_instance( );
        // The callback URL (its ourselves)
        $parts = parse_url( $ci->config->item( 'base_url' ) . "/daemon" );
        if ( strcmp( $parts['scheme'], 'https' ) == 0 )
        {
            $port = 443;
            $host = "ssl://" . $parts['host'];
        }
        else 
        {
            $port = 80;
            $host = $parts['host'];
        }
        if ( ( $fp = fsockopen( $host, isset( $parts['port'] ) ? $parts['port'] : $port, $errno, $errstr, 30 ) ) === FALSE )
        {
            throw new Exception( "Internal server error: background process could not be started" );
        }
        $ci->load->library( 'encrypt' );
        $post_string = "data=" . urlencode( $ci->encrypt->encode( serialize( func_get_args( ) ) ) );
        $out = "POST " . $parts['path'] . " HTTP/1.1\r\n";
        $out .= "Host: " . $host . "\r\n";
        $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $out .= "Content-Length: " . strlen( $post_string ) . "\r\n";
        $out .= "Connection: Close\r\n\r\n";
        $out .= $post_string;
        fwrite( $fp, $out );
        fclose( $fp );
    }
}

This method can be called to process any model::method() in the 'background'. It uses variable arguments.

$this->load->library('daemon');
$this->daemon->execute_background( 'model', 'method', $arg1, $arg2, ... );
Patrick Savalle
0

Suggestion: format a FRAMESET HTML page which contains, let's say, 9 frames inside. Each frame will GET a different "instance" of your myapp.php page. There will be 9 different threads running on the web server, in parallel.
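A sketch of such a page (the query parameters are illustrative; myapp.php is the answer's hypothetical worker script):

```html
<!-- Nine frames, each loading its own instance of myapp.php in parallel. -->
<frameset rows="*,*,*" cols="*,*,*">
  <frame src="myapp.php?job=1"><frame src="myapp.php?job=2"><frame src="myapp.php?job=3">
  <frame src="myapp.php?job=4"><frame src="myapp.php?job=5"><frame src="myapp.php?job=6">
  <frame src="myapp.php?job=7"><frame src="myapp.php?job=8"><frame src="myapp.php?job=9">
</frameset>
```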

0

For PHP 5.5+, mpyw/co is the ultimate solution. It works as if it were tj/co in JavaScript.

Example

Assume that you want to download specified multiple GitHub users' avatars. The following steps are required for each user.

  1. Get content of http://github.com/mpyw (GET HTML)
  2. Find <img class="avatar" src="..."> and request it (GET IMAGE)

---: Waiting my response
...: Waiting other response in parallel flows

Many well-known curl_multi-based scripts already provide flows like the following:

        /-----------GET HTML\  /--GET IMAGE.........\
       /                     \/                      \ 
[Start] GET HTML..............----------------GET IMAGE [Finish]
       \                     /\                      /
        \-----GET HTML....../  \-----GET IMAGE....../

However, this is not efficient enough. Do you want to reduce these worthless waiting times?

        /-----------GET HTML--GET IMAGE\
       /                                \            
[Start] GET HTML----------------GET IMAGE [Finish]
       \                                /
        \-----GET HTML-----GET IMAGE.../

Yes, it's very easy with mpyw/co. For more details, visit the repository page.

mpyw
-1

Here is my own PHP function for making a POST to a specific URL of any page.

Sample usage of the function:

<?php
    parse_str("email=myemail@ehehehahaha.com&subject=this is just a test");
    $_POST['email'] = $email;
    $_POST['subject'] = $subject;
    echo HTTP_Post("http://example.com/mail.php", $_POST);
    exit;
?>
<?php
    /********* HTTP POST using fsockopen **************/
    // by ArbZ

    function HTTP_Post($URL, $data, $referrer = "") {

        // parse the given URL
        $URL_Info = parse_url($URL);

        // Building referrer
        if ($referrer == "") // if not given, use this script as the referrer
            $referrer = $_SERVER["SCRIPT_URI"];

        // making a query string from $data
        $values = array();
        foreach ($data as $key => $value)
            $values[] = "$key=" . urlencode($value);
        $data_string = implode("&", $values);

        // Find out which port is needed - if not given use standard (=80)
        if (!isset($URL_Info["port"]))
            $URL_Info["port"] = 80;

        // building POST request: HTTP headers
        $request  = "POST " . $URL_Info["path"] . " HTTP/1.1\r\n";
        $request .= "Host: " . $URL_Info["host"] . "\r\n";
        $request .= "Referer: $referrer\r\n";
        $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
        $request .= "Content-Length: " . strlen($data_string) . "\r\n";
        $request .= "Connection: close\r\n";
        $request .= "\r\n";
        $request .= $data_string . "\r\n";

        $fp = fsockopen($URL_Info["host"], $URL_Info["port"]);
        fputs($fp, $request);
        $result = "";
        while (!feof($fp)) {
            $result .= fgets($fp, 128);
        }
        fclose($fp);

        // grab the text between the first pair of <span> tags in the response
        function getTextBetweenTags($string, $tagname) {
            $pattern = "/<$tagname ?.*>(.*)<\/$tagname>/";
            preg_match($pattern, $string, $matches);
            return $matches[1];
        }

        $txt = getTextBetweenTags($result, "span");
        $result = explode("&", $result);
        return $result[1];
    }
?>
i am ArbZ
-2

Try this code....

$chu = curl_init();

curl_setopt($chu, CURLOPT_URL, 'http://www.myapp.com/test.php?someprm=xyz');

curl_setopt($chu, CURLOPT_FRESH_CONNECT, true);
curl_setopt($chu, CURLOPT_TIMEOUT, 1);

curl_exec($chu);
curl_close($chu);

Please don't forget to enable the cURL PHP extension.

Mukesh
  • You can set `CURLOPT_TIMEOUT_MS`, e.g. 100 milliseconds, instead of `CURLOPT_TIMEOUT`, which is in seconds and has a minimum of 1 second - for faster execution. – Jason Silver Jun 05 '17 at 00:04
-5

This works fine for me; sadly, you cannot retrieve the response from your request:

<?php
header("Location: http://mahwebsite.net/myapp.php?var=dsafs");
?>

It works very fast; no need for raw TCP sockets :)

D4zk1tty