75

I am working with a script (that I did not originally write) that generates a PDF file from an HTML page. The problem is that it now takes a very long time, 1-2 minutes, to process. Supposedly this worked fine originally, but it has slowed down within the past couple of weeks.

The script calls file_get_contents on a PHP script, then writes the result into an HTML file on the server and runs the PDF generator app on that file.

I seem to have narrowed the problem down to the file_get_contents call on a full URL rather than a local path.

When I use

$content = file_get_contents('test.txt');

it processes almost instantaneously. However, if I use the full url

$content = file_get_contents('http://example.com/test.txt');

it takes anywhere from 30-90 seconds to process.

It's not limited to our server; it is slow when accessing any external URL, such as http://www.google.com. I believe the script calls the full URL because there are necessary query string variables that don't work if you call the file locally.

I also tried fopen, readfile, and curl, and they were all similarly slow. Any ideas on where to look to fix this?

– ecurbh

9 Answers

190

Note: This has been fixed in PHP 5.6.14. A Connection: close header will now automatically be sent even for HTTP/1.0 requests. See commit 4b1dff6.

I had a hard time figuring out the cause of the slowness of file_get_contents scripts.

By analyzing it with Wireshark, the issue (in my case and probably yours too) was that the remote web server DIDN'T CLOSE THE TCP CONNECTION FOR 15 SECONDS (i.e. keep-alive).

Indeed, file_get_contents doesn't send a "Connection" HTTP header, so the remote web server considers it a keep-alive connection by default and doesn't close the TCP stream for 15 seconds (not necessarily a standard value; it depends on the server configuration).

A normal browser would consider the page fully loaded once the HTTP payload length reaches the length specified in the response's Content-Length header. file_get_contents doesn't do this, and that's a shame.

SOLUTION

SO, if you want to know the solution, here it is:

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents("http://www.something.com/somepage.html",false,$context);

The point is simply to tell the remote web server to close the connection when the download is complete, since file_get_contents isn't smart enough to do it by itself using the response's Content-Length header.
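
If you want to verify that keep-alive really is what's costing you the time, a rough before/after check along these lines (the URL is a placeholder) should show the difference:

// Hypothetical timing comparison; replace the URL with your own.
$url = 'http://www.something.com/somepage.html';

$start = microtime(true);
file_get_contents($url); // no context: the server may hold the connection open
printf("Without Connection: close -> %.2f s\n", microtime(true) - $start);

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$start = microtime(true);
file_get_contents($url, false, $context); // server closes the connection right after the body
printf("With Connection: close -> %.2f s\n", microtime(true) - $start);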

– KrisWebDev
  • can this be done from the somepage.html side? (if somepage.html is a php script which can output headers) I tried header('Connection: close'); but it did not work – Ray S. Feb 10 '14 at 11:58
  • I'd like to note that this issue is fixed with PHP 5.6 (http://lxr.php.net/xref/PHP_5_6/ext/standard/http_fopen_wrapper.c#579); so this is not necessary anymore. – bwoebi Jul 08 '15 at 22:16
  • without your solution: **63.4555 seconds** with your solution: **24.6947 seconds** – Elyor Nov 22 '16 at 04:41
  • 2
    I've seen a very similar issue even with PHP7 flavors... maybe it's not really fixed. I'm switching to curl. – Jonny Jun 10 '20 at 01:49
  • 1
    Experiencing the same issue with PHP 7.4.4 and 7.4.8. Using the `Connection: close` trick above or cURL fixes it. (Went from 22 seconds down to under a second.) – Fabien Snauwaert Aug 22 '20 at 10:17
  • Made a massive difference for me in PHP 7.4, even using the absolute path on the web server: a 55MB XML file that was taking up to a minute to load now manages it in less than four seconds. – Steve May 10 '21 at 22:59
  • It also works with PHP `copy()`. Thanks a thousand. `copy($url, $dest, $context);` – Avatar Dec 09 '21 at 15:14
45

I would use cURL to fetch external content, as it is much quicker than file_get_contents. Not sure if this will solve the issue, but worth a shot.

Also note that your server's speed will affect the time it takes to retrieve the file.

Here is an example of usage:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://example.com/test.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$output = curl_exec($ch);
curl_close($ch);
– Jim
  • can you please link to a benchmark about comparing file_get_contents and curl speeds ? – shamittomar Sep 02 '10 at 18:09
  • @shamittomar, the benchmarks vary, but a simple Google search turns up a bunch of different results. http://stackoverflow.com/questions/555523/file-get-contents-vs-curl-what-has-better-performance is one of them. I just know that cURL is faster from various applications I have used in the past, so that is just personal experience, and it makes sense, as cURL was developed for the sole reason of fetching remote files, whereas file_get_contents / fopen were developed for generally reading local files. – Jim Sep 02 '10 at 18:51
  • One advantage of curl is that it will re-use an existing connection (when using the same handle), which is important if you are doing multiple requests to a single host (e.g. API calls); see the sketch after these comments. – blueyed Jun 05 '13 at 17:02
  • I was performing an GET on an API test route (no DB connections or filesystem interaction), and simply switching from `file_get_contents` to `curl` took the response time down from ~500ms to ~100ms – Kristian Aug 12 '13 at 19:52
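
Picking up on blueyed's comment about connection re-use, here is a minimal sketch (placeholder URLs) of sending several requests to the same host through one cURL handle, so the underlying TCP connection can be kept alive and re-used:

$responses = array();
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Re-using the same handle lets cURL keep the connection to the host open.
foreach (array('/test1.txt', '/test2.txt', '/test3.txt') as $path) {
    curl_setopt($ch, CURLOPT_URL, 'http://example.com' . $path);
    $responses[$path] = curl_exec($ch);
}

curl_close($ch);
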
8

Sometimes it's because DNS resolution is slow on your server. Try this:

replace

echo file_get_contents('http://www.google.com');

with

$context=stream_context_create(array('http' => array('header'=>"Host: www.google.com\r\n")));
echo file_get_contents('http://74.125.71.103', false, $context);
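
To check whether DNS resolution really is the bottleneck, you can time the lookup on its own first (a rough sketch; gethostbyname is the standard PHP call):

$start = microtime(true);
$ip = gethostbyname('www.google.com'); // returns the hostname unchanged on failure
printf("Resolved to %s in %.2f s\n", $ip, microtime(true) - $start);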
– diyism
  • This was the issue in my case. Two DNS servers were configured in `/etc/resolv.conf`, but the first server was unreachable. DNS lookups timed out on the first server, then jumped to the second DNS server several seconds later. – thirdender May 22 '14 at 01:33
  • Or simply replace `$result = file_get_contents('http://google.com', false, $context);` with `$ip = gethostbyname('google.com');` `$result = file_get_contents("http://$ip", false, $context);` – Ronald Bijker Sep 22 '16 at 09:54
3

I had the same issue. The only thing that worked for me was setting a timeout in the $options array.

// $headers is assumed to be an array of raw header lines, $url the remote URL.
$options = array(
    'http' => array(
        'header'  => implode("\r\n", $headers),
        'method'  => 'POST',
        'content' => '',
        'timeout' => .5
    ),
);
$context = stream_context_create($options);
$result  = file_get_contents($url, false, $context);
– Walid Ammar
  • it is the time out but I have no idea why. My best guess is there is IPv6 stupidity on OS X that you cannot turn off. Curl works fine, but file_get_contents will take over 60 seconds depending on timeout. NOTE: IPv6 is disabled on the public interface for this crapintosh, you cannot disable IPv6 globally or on the loopback. – Alex Barker Sep 20 '16 at 23:09
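
If IPv6 is the suspect, as in the comment above, one workaround worth testing (an assumption, not a guaranteed fix) is to force the request over IPv4 via the stream context's socket bindto option:

// Binding to 0.0.0.0 (any IPv4 address, any port) forces the connection over IPv4.
$context = stream_context_create(array('socket' => array('bindto' => '0:0')));
$content = file_get_contents('http://example.com/test.txt', false, $context);
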
3
$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$string = file_get_contents("http://localhost/testcall/request.php",false,$context);

Time: 50976 ms (average over 5 attempts)

$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, "http://localhost/testcall/request.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
echo $data = curl_exec($ch);
curl_close($ch);

Time: 46679 ms (average over 5 attempts)

Note: request.php is used to fetch some data from a MySQL database.
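
The answer doesn't show how these times were measured; a simple harness along these lines (assumed, not part of the original benchmark) reproduces the comparison:

// Average several runs of a callable, in milliseconds.
function time_it(callable $fn, $runs = 5) {
    $start = microtime(true);
    for ($i = 0; $i < $runs; $i++) {
        $fn();
    }
    return (microtime(true) - $start) / $runs * 1000;
}

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
printf("file_get_contents: %.0f ms\n", time_it(function () use ($context) {
    file_get_contents("http://localhost/testcall/request.php", false, $context);
}));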

– Amito
1

Can you try fetching that URL, on the server, from the command line? curl or wget come to mind. If those retrieve the URL at a normal speed, then it's not a network problem and is most likely something in the Apache/PHP setup.

– Marc B
  • 1
    When I try wget from the command line, that is also very slow. It is hanging at the resolving... step. Some kind of DNS problem on the server? – ecurbh Sep 02 '10 at 17:45
  • Could be. Try using 'host' or 'nslookup' (whatever's available) and try to resolve various different hostnames from the system. – Marc B Sep 02 '10 at 19:01
1

I have a large amount of data passed back by an API. I was using file_get_contents to read it, and it took around 60 seconds. With KrisWebDev's solution it took around 25 seconds.

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents($url,false,$context);
– Elyor
0

What I would also consider with cURL is that you can "thread" the requests. This has helped me immensely, as I do not have access to a version of PHP that allows threading at the moment.

For example, I was getting 7 images from a remote server using file_get_contents and it was taking 2-5 seconds per request. That alone added 30 seconds or so while the user waited for the PDF to be generated.

This reduced the total time to roughly that of a single image. Another example: I now verify 36 URLs in the time it previously took to do one. I think you get the point. :-)

    $timeout = 30;
    $retTxfr = 1;
    $user = '';
    $pass = '';

    $master = curl_multi_init();
    $node_count = count($curlList);
    $keys = array("url");

    for ($i = 0; $i < $node_count; $i++) {
        foreach ($keys as $key) {
            if (empty($curlList[$i][$key])) continue;
            $ch[$i][$key] = curl_init($curlList[$i][$key]);
            curl_setopt($ch[$i][$key], CURLOPT_TIMEOUT, $timeout); // -- timeout after X seconds
            curl_setopt($ch[$i][$key], CURLOPT_RETURNTRANSFER, $retTxfr);
            curl_setopt($ch[$i][$key], CURLOPT_HTTPAUTH, CURLAUTH_ANY);
            curl_setopt($ch[$i][$key], CURLOPT_USERPWD, "{$user}:{$pass}");
            curl_multi_add_handle($master, $ch[$i][$key]);
        }
    }

    // -- run all requests at once, finish when done or timeout met --
    do {
        curl_multi_exec($master, $running);
        curl_multi_select($master); // wait for activity instead of busy-looping
    } while ($running > 0);

Then check over the results:

            // collect the response body for this handle
            $results[$i][$key] = curl_multi_getcontent($ch[$i][$key]);
            if ((int)curl_getinfo($ch[$i][$key], CURLINFO_HTTP_CODE) > 399 || empty($results[$i][$key])) {
                unset($results[$i][$key]);
            } else {
                $results[$i]["options"] = $curlList[$i]["options"];
            }
            curl_multi_remove_handle($master, $ch[$i][$key]);
            curl_close($ch[$i][$key]);

Then close the multi handle:

    curl_multi_close($master);
– Mike Q
0

I know this is an old question, but I found it today and the answers didn't work for me. I didn't see anyone mention that the maximum number of connections per IP may be set to 1. In that case you are making an API request and the API is itself making another request to the same server (because you use the full URL), so the two requests compete for the single allowed connection. That's why loading directly from disk works. For me, this fixed the problem:

// If the URL points back at this same application, strip the app URL
// and read the file from a local path instead of making an HTTP request.
if (strpos($file->url, env('APP_URL')) === 0) {
    $url = substr($file->url, strlen(env('APP_URL')));
} else {
    $url = $file->url;
}
return file_get_contents($url);
– ElChupacabra