<?php
function getTitle($url){
    $str = file_get_contents($url);
    // file_get_contents() returns false on failure, so check that before
    // touching $title[1], which is unset when the pattern doesn't match
    if($str !== false && preg_match('/<title>(.*?)<\/title>/is', $str, $title)){
        return $title[1];
    }
    return null;
}
echo getTitle("http://www.stackoverflow.com/");
?>

When I run this script, it fails with the error

Fatal error: Maximum execution time of 30 seconds exceeded

I don't want to increase max_execution_time; I want to decrease the script's run time.

  • I know that file_get_contents() is optimized for local files, not external ones; in combination with a really slow server this can be slow. Try [cURL](http://coderscult.com/php/php-curl/2008/05/20/php-curl-tutorial-and-example/) (see the sketch after these comments). By the way, I just called the same code from localhost and read it in 2.3 secs, so no problem except for Stack Overflow slow times (sometimes) –  May 25 '12 at 11:16
  • you might want to check how much time each part of your script needs, the file_get_contents() or the strlen() part; a timing sketch follows these comments – periklis May 25 '12 at 11:19
  • duplicate of [this](http://stackoverflow.com/questions/1378915/header-only-retrieval-in-php-via-curl) –  May 25 '12 at 11:21
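
The first comment suggests cURL; a minimal sketch of that approach follows. This is not the asker's code: the function name getTitleCurl and the 5- and 10-second timeout values are illustrative assumptions, but the cURL options themselves are standard.

<?php
function getTitleCurl($url){
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // give up connecting after 5 s (assumed value)
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // hard cap on the whole transfer (assumed value)
    $str = curl_exec($ch);
    curl_close($ch);
    if($str !== false && preg_match('/<title>(.*?)<\/title>/is', $str, $m)){
        return $m[1];
    }
    return null; // fetch failed or no <title> found
}
echo getTitleCurl("http://www.stackoverflow.com/");
?>

With the timeouts in place, a slow remote server makes the function return null quickly instead of letting the script run into the 30-second max_execution_time limit.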
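And per the second comment, a rough way to see where the time goes is to bracket each step with microtime(true). A sketch, reusing the URL from the question:

<?php
$t0 = microtime(true);
$str = file_get_contents("http://www.stackoverflow.com/"); // the network fetch
$t1 = microtime(true);
preg_match('/<title>(.*?)<\/title>/is', $str, $title);     // the parsing step
$t2 = microtime(true);
printf("fetch: %.3fs, parse: %.3fs\n", $t1 - $t0, $t2 - $t1);
?>

The fetch will almost certainly dominate; strlen() and the regex are cheap by comparison.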

1 Answer


You're downloading the whole page. Why not specify the $maxlen parameter and fetch only roughly the head of the page, where the title tag lives? That would give you a little speedup.

string file_get_contents ( string $filename [, bool $use_include_path = false [, resource $context [, int $offset = -1 [, int $maxlen ]]]] )
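
For example, a sketch along those lines; the 4096-byte cap and the 5-second stream timeout are assumptions, chosen on the guess that the <title> tag sits near the top of the document:

<?php
// stop waiting on a slow server after 5 seconds (assumed value)
$context = stream_context_create(array('http' => array('timeout' => 5)));
// read at most the first 4096 bytes instead of the whole page
$str = file_get_contents("http://www.stackoverflow.com/", false, $context, 0, 4096);
if($str !== false && preg_match('/<title>(.*?)<\/title>/is', $str, $title)){
    echo $title[1];
}
?>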

pdu
  • I'm making this to get the title of any page on the internet, therefore I have no idea what $maxlen should be –  May 25 '12 at 11:18