
I am getting a lot of spam referrer URLs in my analytics, causing an increase in my site's bounce rate. Some of the URLs use subdomains such as site1.spamsite.com and site2.spamsite.com. What is the best way to block these? I have looked at .htaccess and robots.txt... Thought I would ask for the best practice / solution before I implement. Thanks.

– jmiller

2 Answers


It does not matter whether they use subdomains or not. Try the following simple approach:

RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://.*spamsite\.com/ [NC]
RewriteRule ^ - [F,L]
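The same idea extends to several spam domains at once, using `[OR]` to chain conditions. This is a sketch, not part of the original answer; the domain names are placeholders, and the `([^.]+\.)*` prefix matches any chain of subdomain labels without also matching unrelated domains that merely end in the same string:

```apache
RewriteEngine On
# [NC] makes the match case-insensitive; [OR] chains the conditions.
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*spamsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*otherspam\.example [NC]
# [F] returns 403 Forbidden (and implies [L]).
RewriteRule ^ - [F]
```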
– hjpotter92
  • Thanks. I'm assuming the * will act as a wildcard and block all urls from that domain? Can this be done with robots.txt? – jmiller Oct 30 '15 at 16:32
  • I've also seen this example elsewhere: `# free-share-button RewriteCond %{HTTP_REFERER} ^http://([^.]+\.)*free-share-button\.com [NC] RewriteRule (.*) http://www.free-share-button.com [R=301,L]` What's the difference between `RewriteRule ^ – [F,L]` and this example? Thanks. – jmiller Oct 30 '15 at 16:42
  • In the `[F,L]` case, the user will be blocked from viewing your site and shown an HTTP 403 Forbidden page. In the snippet you pasted, the user will instead be redirected back to `free-share-button.com`. – hjpotter92 Oct 31 '15 at 05:54
  • @hjpotter92 Thanks for the reply. I have successfully blocked about 80% of the referrer URLs; however, there are still a couple getting through! Does .htaccess take time to work? Can I use `deny` in robots.txt too? – jmiller Nov 01 '15 at 07:48
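To answer the wildcard question in the first comment: yes, `.*` in the accepted answer's pattern matches any subdomain prefix, so the one condition covers site1, site2, and so on. The pattern can be exercised directly with Python's `re` module, whose syntax is close enough to Apache's PCRE for this case (a quick sketch, not part of the original answers):

```python
import re

# The pattern from the accepted answer's RewriteCond, case-insensitive
# as [NC] would make it.
pattern = re.compile(r"^https?://.*spamsite\.com/", re.IGNORECASE)

referers = [
    "http://spamsite.com/",            # bare domain: matches
    "http://site1.spamsite.com/page",  # subdomain: matches
    "https://site2.spamsite.com/",     # https subdomain: matches
    "http://notspamsite.com/",         # different domain: ALSO matches!
    "http://example.com/",             # unrelated: no match
]

for ref in referers:
    print(ref, "->", bool(pattern.match(ref)))
```

Note the caveat on the fourth line: because `.*` matches anything, the rule also blocks domains that merely end in `spamsite.com`, such as `notspamsite.com`. The `([^.]+\.)*` form quoted in the second comment avoids this by only allowing whole dot-separated labels before the domain.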

I tried various .htaccess rules but subdomains kept getting through. I installed this plugin:

https://wordpress.org/plugins/block-referer-spam/

It has all known spam referrers listed already and is automatically updated every day with the latest blacklist.

– jmiller