I need to make the acquaintance of SOAP, so I wrote a simple client that connects to a random public web service. (It turns out that even finding a working service is a bit of a hassle.)
The code I have so far seems to work - but here's the thing: it only works once every ten seconds.
When I first load the page it shows the result I expect - a var_dump of an object - but when I reload the page right after that, all I see is "Error Fetching http headers". No matter how many times I refresh, it takes around ten seconds until I get the right result again, and then the process repeats: refresh too quickly, get an error.
I can't see what's going on at the HTTP level, and even if I could, I'm not sure I'd be able to draw the right conclusions.
Answers to similar questions posted here include setting the keep_alive option to false or extending the default_socket_timeout, but neither solution worked for me.
So, long story short: is this an issue on the service's end or a problem I can remedy, and if it's the latter, how?
Here's the code I have so far:
<?php
error_reporting(-1);
ini_set("display_errors", true);
ini_set("max_execution_time", 600);
ini_set("default_socket_timeout", 600);

$wsdl = "http://api.chartlyrics.com/apiv1.asmx?WSDL";

try
{
    // keep_alive => false is supposed to stop the client from reusing the
    // connection; trace => true records the raw HTTP exchange for debugging.
    $client = new SoapClient($wsdl, array(
        "keep_alive" => false,
        "trace"      => true
    ));
    $response = $client->SearchLyricDirect(array(
        "artist" => "beatles",
        "song"   => "norwegian wood"
    ));
    var_dump($response);
}
catch (Exception $e)
{
    echo $e->getMessage();
}
?>
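Since I can't watch the traffic with an external tool, the closest I can get is the raw exchange that trace => true records; here's a minimal sketch of dumping it (same request as above, just printing the headers afterwards):
<?php
// Minimal sketch: same request as above, but dumping the raw HTTP headers
// that "trace" => true records, to see what the server actually answers
// on the quick second request.
$wsdl = "http://api.chartlyrics.com/apiv1.asmx?WSDL";
try
{
    $client = new SoapClient($wsdl, array("keep_alive" => false, "trace" => true));
    $client->SearchLyricDirect(array("artist" => "beatles", "song" => "norwegian wood"));
}
catch (Exception $e)
{
    echo $e->getMessage();
}
if (isset($client))
{
    // Empty output here would mean no response came back at all.
    echo "<pre>";
    echo $client->__getLastRequestHeaders();
    echo $client->__getLastResponseHeaders();
    echo "</pre>";
}
?>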
Any help would be appreciated. (And as a bonus, if you could enlighten me as to why saving the WSDL locally speeds the process up by 30 seconds, that'd be great as well. I assume it's the DNS lookup that takes so much time?)
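For reference, this is roughly what I mean by saving the WSDL locally (the file name is just an example):
<?php
// Sketch of the "local WSDL" variant: download the WSDL once and point
// SoapClient at the local copy on later requests.
$localWsdl = __DIR__ . "/chartlyrics.wsdl";
if (!file_exists($localWsdl))
{
    copy("http://api.chartlyrics.com/apiv1.asmx?WSDL", $localWsdl);
}
$client = new SoapClient($localWsdl, array(
    "keep_alive" => false,
    "trace"      => true
));
?>
(I thought PHP was supposed to cache the remote WSDL on disk anyway via soap.wsdl_cache_enabled, so I'm not sure why fetching it over HTTP costs an extra 30 seconds every time.)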