
We have some links we want to hide from Google, using JavaScript to "hide" the link while keeping it working for real visitors.

I was told by the SEO agency that the best method is to Base64-encode the link and decode it via JavaScript:

<a data-href="RdcDovL1N0YWdpbmc...base64...hhcmRpbmctaGVycmVuLWhlbaQtMTgyMDg3"
   href="#">Link</a>


<script>
var _dlist = document.getElementsByTagName('A');
for (var i = 0; i < _dlist.length; i++) {
    var _data = _dlist[i].getAttribute( 'data-href' );
    // getAttribute returns null (not the string 'null') when the attribute is absent
    if( _data !== null ) {
        var _extend = CryptoJS.enc.Base64.parse( _data );
        _dlist[i].setAttribute( 'href', _extend.toString( CryptoJS.enc.Latin1 ) );
    }
}
</script>

My problem now is that I don't want to include another two files (they suggested the crypto-js library) just for the links. How far does Google decode and follow such links, and what's the easiest approach without loading more scripts? jQuery is available.
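Since the question asks for a route without extra libraries: modern browsers expose `window.atob` for Base64 decoding, so crypto-js isn't needed for this at all. A minimal sketch (the `decodeDataHref` helper name is my own, not from the original code):

```javascript
// Decode a Base64-encoded URL; atob is built into all modern browsers
// (and Node.js 16+), so no external library is required.
function decodeDataHref(encoded) {
    return atob(encoded);
}

// Wire it up to the same data-href convention as above.
// Guarded so the helper can also be exercised outside a browser.
if (typeof document !== 'undefined') {
    var anchors = document.getElementsByTagName('A');
    for (var i = 0; i < anchors.length; i++) {
        var data = anchors[i].getAttribute('data-href');
        if (data !== null) {
            anchors[i].setAttribute('href', decodeDataHref(data));
        }
    }
}
```

Note that `atob` returns a Latin-1 string, which matches what the crypto-js snippet above produces via `CryptoJS.enc.Latin1`.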

Daniel W.
    why don't you just use `rel="nofollow"` on your links? – FabioG Feb 17 '14 at 10:43
  • @FabioG With `rel="nofollow"` you lose link juice; the other links lose SEO power that way, so nofollow doesn't solve it. – Daniel W. Feb 17 '14 at 11:03
  • @FabioG check this article: http://www.seomofo.com/marketing/seo-for-affiliate-links.html – Daniel W. Feb 17 '14 at 11:34
  • Got your point, I wasn't aware of that. In that case, why don't you add those links with JS in a `window.onload` function instead of having them in the HTML and then changing their `href` attribute? – FabioG Feb 17 '14 at 11:49
  • @FabioG We think Google is able to interpret "simply" inserted links. Maybe I'm going to block Google from accessing our already-included JS files and put a simple function into one of them that alters the HTML links. – Daniel W. Feb 17 '14 at 11:57
  • 1
    According to an article i read a while ago a simple `innerHTML`, for example, is interpreted by google bot if it's used inline in the HTML file but it won't be interpreted if it's included in an external JS file. – FabioG Feb 17 '14 at 12:06
  • This is what I'm planning to do. But still, if I put a link like `test`, Google does recognize that the actual target page is what is mentioned in `data-href` (at least according to the SEO agency). I don't want to implement a mapping of links in an external file (we have about 200k links). I must put it into the ``, but encoded – Daniel W. Feb 17 '14 at 12:11
  • Or you could store those links in an XML file, read them in JavaScript, and create the links in a loop. – FabioG Feb 17 '14 at 12:34
  • This question appears to be off-topic because it is about SEO – John Conde Feb 17 '14 at 14:05
  • 1
    @JohnConde how many times do you copy & paste this comment? The focus of this question is on JavaScript, SEO is just the background specifying the circumstances. – Daniel W. Feb 17 '14 at 14:29

2 Answers


This is what I ended up with:

Links look like:

<a href="#" onclick="linkAction(this); return false;"
   data-href="uggc://fgntvat.....">Link</a>

Where `data-href` is ROT13-encoded and `linkAction` does:

function linkAction(e) {
    // Decode the ROT13-obfuscated URL and navigate to it (jQuery is available)
    window.location = rot13($(e).data('href'));
}

...in an external JS file.

I think this is the best obfuscation without performance overhead. Let's see what the SEO agency says :-)

P.S. ROT13 taken from: Where is my one-line implementation of rot13 in JavaScript going wrong?
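For completeness, a common ROT13 implementation along those lines (a sketch, not necessarily the exact code from the linked question):

```javascript
// ROT13: rotate each Latin letter 13 places; applying it twice
// returns the original string, so one function both encodes and decodes.
function rot13(s) {
    return s.replace(/[a-zA-Z]/g, function (c) {
        var base = c <= 'Z' ? 65 : 97; // char code of 'A' or 'a'
        return String.fromCharCode((c.charCodeAt(0) - base + 13) % 26 + base);
    });
}
```

For example, `rot13('uggc')` yields `http`, and running the result through `rot13` again restores the obfuscated form; digits and punctuation pass through unchanged.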

Daniel W.

The thread has been a bit abandoned, and the circumstances have changed somewhat too. The code cited at the beginning seems to be from the agency I was working for.

After it became known that Googlebot is essentially a kind of Chrome, and after posts like this, there are not many approaches left for hiding links from Googlebot.

One approach seems promising to me; maybe, if somebody else finds it promising too, it could be implemented:

  • Step 1: publish a non-existent image
  • Step 2: write a JavaScript function that finds links with a certain class, e.g. `class="hidden"`, and mangles the URLs in their `href` attributes using a regex. By mangling I mean something like: replacing `.` and/or `/` with `|`, `%`, or `*`, or splitting the URL into parts with non-URL characters such as `|` or `*`.
  • Step 3: write a second JavaScript function that turns the mangled URLs back into real URLs
  • Step 4: place both scripts in an external file and close it against crawling with `X-Robots-Tag`
  • Step 5: fire `onError` for the non-existent image at least 6 seconds after `onLoad` (to be reasonably sure Googlebot has gone away)
  • Step 6: trigger the second function through `onError`, making the URLs real URLs again

The approach could perhaps also be reversed: the URL mangling could be triggered immediately through `onError`, and the un-mangling through `onClick`.
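The mangle/restore pair from steps 2-3 could be sketched like this (the character choices `|` and `*` follow the description above; the function names and the image-trigger wiring are my own illustration):

```javascript
// Step 2: mangle a URL by swapping URL characters for non-URL ones.
function mangleUrl(url) {
    return url.replace(/\./g, '|').replace(/\//g, '*');
}

// Step 3: restore the original URL by reversing the substitutions.
function restoreUrl(mangled) {
    return mangled.replace(/\|/g, '.').replace(/\*/g, '/');
}

// Steps 5-6: wiring sketch -- restore all hidden links once the
// non-existent image errors out (guarded for non-browser environments).
if (typeof document !== 'undefined') {
    var img = new Image();
    img.onerror = function () {
        var links = document.querySelectorAll('a.hidden');
        for (var i = 0; i < links.length; i++) {
            links[i].setAttribute('href',
                restoreUrl(links[i].getAttribute('href')));
        }
    };
    // Delay the request so the error fires well after page load (step 5).
    setTimeout(function () { img.src = '/no-such-image.png'; }, 6000);
}
```

Note the substitution characters must not otherwise occur in the URLs, or the round trip won't be lossless.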

Evgeniy