I'm trying to "get" text from another website and publish it on mine, so that when the other website updates the text inside a "div" or other element, my website will update as well.
Can this be done in PHP? And if so, how?
PHP has a built-in function, file_get_contents, to do this:
$html = file_get_contents("http://www.website.com");
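As a side note, file_get_contents can actually take a timeout if you pass it a stream context. A minimal sketch (the URL is a hypothetical placeholder, and the fetch itself is commented out since it makes a live network call):

```php
<?php
// Build a stream context with an HTTP timeout in seconds.
$context = stream_context_create([
    'http' => ['timeout' => 5],
]);

// Hypothetical URL; uncomment to perform the actual request.
// $html = file_get_contents("http://www.website.com", false, $context);
```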
However, this gives you little control over the request, so here's a quick function using cURL:
function getHTML($url, $timeout)
{
    $gs = curl_init($url); // initialize cURL with the given URL
    curl_setopt($gs, CURLOPT_USERAGENT, $_SERVER["HTTP_USER_AGENT"]); // set user agent
    curl_setopt($gs, CURLOPT_RETURNTRANSFER, true); // return the response as a string
    curl_setopt($gs, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($gs, CURLOPT_CONNECTTIMEOUT, $timeout); // max. seconds to connect
    curl_setopt($gs, CURLOPT_FAILONERROR, true); // fail on HTTP error codes
    $html = @curl_exec($gs); // false on failure
    curl_close($gs); // free the handle
    return $html;
}
Then you can just use a regular expression to extract the data you want, e.g.
preg_match("/<title>(.*)<\/title>/i", $html, $match);
$pagetitle = $match[1];
EDIT:
In response to the comment below regarding regex, I suggest you check out the following Stack Overflow question and answer, as the PHP Document Object Model may well be what you're looking for.
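To illustrate the DOM approach, here's a minimal sketch using PHP's DOMDocument and DOMXPath to pull text out of a div by id. The inline HTML string and the "news" id are hypothetical stand-ins for whatever the remote page actually serves:

```php
<?php
// Hypothetical markup standing in for a fetched remote page.
$html = '<html><body><div id="news">Latest headline</div></body></html>';

$doc = new DOMDocument();
@$doc->loadHTML($html); // @ suppresses warnings on messy real-world HTML

// Query for the div we want by its id.
$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//div[@id="news"]');
$text  = $nodes->length ? $nodes->item(0)->textContent : null;

echo $text; // prints "Latest headline"
```

Unlike a regex, this keeps working if the div gains attributes or nested tags.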
What about this:
<?php
function getHTMLData($url, $query)
{
    $data = simplexml_load_file($url); // false if the page isn't well-formed XML
    return $data->$query; // access the child element named in $query
}
Remember that HTML is closely related to XML, which browsers parse by its tags.
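A caveat worth noting: simplexml only parses well-formed XML, so this works on XHTML or feeds but fails on most real-world HTML. A sketch with a hypothetical well-formed document, loaded from a string instead of a URL:

```php
<?php
// Hypothetical well-formed XML standing in for a remote page.
$xml = '<page><title>Example title</title></page>';

$data = simplexml_load_string($xml); // false if the markup is malformed

// Child elements are accessed as properties of the SimpleXMLElement.
echo (string) $data->title; // prints "Example title"
```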