Right now I load the Facebook, Twitter, Google, Pinterest, ... scripts asynchronously by appending them to the DOM after the window load event, like this:
<script type="text/javascript">
function sm_l() {
    // Twitter
    !function(d, s, id) {
        var js, fjs = d.getElementsByTagName(s)[0];
        if (!d.getElementById(id)) {
            js = d.createElement(s);
            js.id = id;
            js.src = "//platform.twitter.com/widgets.js";
            js.async = true;
            fjs.parentNode.insertBefore(js, fjs);
        }
    }(document, "script", "twitter-wjs");

    // Google+
    window.___gcfg = {
        lang: 'es'
    };
    (function() {
        var po = document.createElement('script');
        po.type = 'text/javascript';
        po.async = true;
        po.src = 'https://apis.google.com/js/plusone.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(po, s);
    })();

    // Facebook
    (function(d, s, id) {
        var js, fjs = d.getElementsByTagName(s)[0];
        if (d.getElementById(id)) return;
        js = d.createElement(s);
        js.id = id;
        js.async = true;
        js.src = "//connect.facebook.net/es_ES/all.js#xfbml=1&appId=121495767915738";
        fjs.parentNode.insertBefore(js, fjs);
    }(document, 'script', 'facebook-jssdk'));

    // Pinterest
    (function() {
        var po = document.createElement('script');
        po.type = 'text/javascript';
        po.async = true;
        po.src = '//assets.pinterest.com/js/pinit.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(po, s);
    })();

    $('#share_the_love').show();
    $('.social_media').slideDown();
}
</script>
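For completeness, sm_l() gets attached to the window load event roughly like this (a sketch; the exact hook-up in my page may differ slightly):

// Call sm_l once the page has fully loaded, so the social widgets
// don't compete with the page's own resources.
if (window.addEventListener) {
    window.addEventListener('load', sm_l, false);
} else if (window.attachEvent) {
    window.attachEvent('onload', sm_l); // older IE
}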
Thing is, this still makes many requests that can't be cached with .htaccess, because those are external resources.
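Locally served files, by contrast, can be given long cache lifetimes with something like the following (a sketch, assuming Apache with mod_expires enabled; my exact rules may differ), but none of it applies to scripts fetched from the third-party hosts:

# .htaccess sketch: far-future caching for local static JS (needs mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>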
So I thought I'd cache them all in a single local .js file and load that one instead, like this:
<?php
function curl_file_get_contents($url) {
    $curl = curl_init();
    $userAgent = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)';
    curl_setopt($curl, CURLOPT_HEADER, false);          // Don't prepend HTTP headers to the output (they would corrupt the JS). Was curl_setopt($s, ..., true): $s is undefined.
    curl_setopt($curl, CURLOPT_URL, $url);              // The URL to fetch.
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);   // Return the transfer as a string instead of outputting it directly.
    curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 5);      // Seconds to wait while trying to connect.
    curl_setopt($curl, CURLOPT_USERAGENT, $userAgent);  // Contents of the "User-Agent:" request header.
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);   // Follow any "Location:" redirects the server sends.
    curl_setopt($curl, CURLOPT_TIMEOUT, 10);            // Maximum seconds the transfer may take.
    curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);  // Skip verifying the peer's certificate.
    curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, false);
    $contents = curl_exec($curl);
    curl_close($curl);
    return $contents;
}

$resources = array(
    'https://apis.google.com/js/plusone.js',
    'https://connect.facebook.net/es_LA/all.js#xfbml=1&appId=121495767915738',
    'https://assets.pinterest.com/js/pinit.js',
    'http://platform.twitter.com/widgets.js'
);
$result = '';
foreach ($resources as $resource) {
    // Separate the concatenated files so a missing trailing semicolon
    // in one script can't break the next one.
    $result .= curl_file_get_contents($resource) . "\n;\n";
}
file_put_contents('./js/cached/social.js', $result);
?>
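To avoid fetching the external scripts on every request, the rebuild can be gated on the file's age — a minimal sketch reusing the function and $resources array above (the one-hour TTL is just an assumption for illustration):

<?php
// Sketch: only rebuild the combined file when it is missing or stale.
$cache = './js/cached/social.js';
$ttl   = 3600; // seconds; arbitrary choice for illustration
if (!file_exists($cache) || time() - filemtime($cache) > $ttl) {
    $result = '';
    foreach ($resources as $resource) {
        $result .= curl_file_get_contents($resource) . "\n;\n";
    }
    file_put_contents($cache, $result);
}
?>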
Then I load the cached file instead:
<script type="text/javascript">
function sm_l() {
    (function() {
        var po = document.createElement('script');
        po.type = 'text/javascript';
        po.async = true;
        po.src = '//funcook.com/js/cached/social.js';
        var s = document.getElementsByTagName('script')[0];
        s.parentNode.insertBefore(po, s);
    })();
    $('#share_the_love').show();
    $('.social_media').slideDown();
}
</script>
Thing is, the Twitter buttons seem to work, but the Google+ and Facebook ones do not.
I am not getting any errors in the console either; I even get the usual Facebook log.
Test it here: http://jsfiddle.net/m4fms9u6/1/
Any idea what I am missing?
-EDIT-
Main reason: