
I am working on my own web stats script.

I have successfully eliminated many bots. The problem is only with Twitter: when I tweet a link, I immediately see 20-30 visits on the URL that I suspect are cURL/unknown spiders.

What is an efficient way of detecting such visitors? I want to make my visitor stats more accurate.

P.S. These visitors are not reported/seen in Google Analytics, and for known bots I remove these user agents from the stats:

<?php
$bot_agents = array(
    "Butterfly", "Twitturls", "Me.dium", "Twiceler", "facebookexternalhit",
    "Teoma", "alexa", "froogle", "Gigabot", "inktomi",
    "looksmart", "URL_Spider_SQL", "Firefly", "NationalDirectory",
    "Ask Jeeves", "TECNOSEEK", "InfoSeek", "WebFindBot", "girafabot",
    "crawler", "www.galaxy.com", "Googlebot", "Scooter", "Slurp",
    "msnbot", "appie", "FAST", "WebBug", "Spade", "ZyBorg", "rabaz",
    "Baiduspider", "Feedfetcher-Google", "TechnoratiSnoop", "Rankivabot",
    "Mediapartners-Google", "Sogou web spider", "WebAlta Crawler", "TweetmemeBot"
);
?>
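For context, here is a minimal sketch of how that list could be matched against the request's user agent. The function name `is_bot` and the use of `stripos()` are my own assumptions, not part of the original script; an empty user agent is also treated as a bot, since raw cURL requests often send none:

```php
<?php
// Illustrative list (subset of the array above); the real script
// would pass in the full $bot_agents array.
$bot_agents = array("Googlebot", "TweetmemeBot", "crawler", "Slurp");

// is_bot() is a hypothetical helper: returns true when the user agent
// is empty or contains any entry from the bot list (case-insensitive).
function is_bot($user_agent, $bot_agents) {
    if ($user_agent === '' || $user_agent === null) {
        // Scripted cURL requests frequently send no User-Agent header.
        return true;
    }
    foreach ($bot_agents as $bot) {
        // Case-insensitive substring match against the user agent.
        if (stripos($user_agent, $bot) !== false) {
            return true;
        }
    }
    return false;
}

// Typical call site in a stats script, using the request's header:
// $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
// if (!is_bot($ua, $bot_agents)) { /* record the visit */ }
?>
```

This only catches bots that identify themselves (or send nothing); spiders that spoof a browser user agent would slip past a list like this.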

Thanks

Abbas Arif
  • Possible duplicate of: [How to detect fake users ( crawlers ) and cURL](http://stackoverflow.com/questions/12257584/how-to-detect-fake-users-crawlers-and-curl) – Henry A. Mar 30 '16 at 18:48
  • My scenario is different: I just want my stats to be accurate. I do not want to block crawlers or anything like that. – Abbas Arif Mar 30 '16 at 19:00
