
I'm currently coding a view/visitor counter that checks whether a visitor is a human or a bot/crawler. I have already found some solutions, which I use. One of them is a cookie set with JavaScript, but some bots already accept cookies, and some humans don't. :/

Now I'm wondering whether it would be more effective to set something in HTML5 storage instead. Is it perhaps less likely that bots support HTML5 storage than cookies? And is it possible to disable HTML5 storage in any (human) browser? I haven't found such an option yet.
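For what it's worth, HTML5 storage *can* be unavailable for human visitors too: browsers that block cookies typically block DOM storage as well, and accessing `window.localStorage` can then throw. A defensive feature check (a minimal sketch along the lines of MDN's `storageAvailable` pattern) looks like this:

```javascript
// Sketch: feature-detect Web Storage defensively.
// Merely reading window.localStorage can throw (e.g. when the user has
// blocked cookies/site data), so wrap the whole probe in try/catch.
function storageAvailable() {
  try {
    var storage = window.localStorage;
    var key = '__storage_test__';
    storage.setItem(key, '1');   // round-trip a dummy value
    storage.removeItem(key);
    return true;
  } catch (e) {
    return false;                // disabled, blocked, or not supported
  }
}
```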

Thanks, Sam.

PS: Some useful/informative stuff I already know/use within my script:

SamBrishes

2 Answers


Crawlers don't normally execute JavaScript, so you can monitor visits with JS. You can also filter out the user agents of known bots.

Here is the first list I found of crawler user agents:

https://deviceatlas.com/blog/list-of-web-crawlers-user-agents
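The user-agent filtering idea can be sketched like this. Note the bot tokens below are a small illustrative subset, not the full list from the link, and a user agent is trivially spoofable, so treat this as one signal among several:

```javascript
// Sketch: classify a user-agent string against a few well-known bot tokens.
// The pattern is illustrative only; real lists (like the one linked above)
// are much longer and need to be kept up to date.
var BOT_PATTERN = /bot|crawler|spider|slurp|googlebot|bingbot|baiduspider|yandex/i;

function looksLikeBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}
```

In the browser you would call it as `looksLikeBot(navigator.userAgent)` before counting the visit.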

Pandelis

I used a tool to render the page as Googlebot, and the result is that Googlebot supports HTML5 storage:

The code to test storage support: https://codepen.io/gab/pen/AxFoB

The pen uses this snippet for detection:

/* Detect whether the browser supports web storage.
   Note: the original pen had `!typeof(Storage) !== 'undefined'`,
   whose extra `!` makes the condition always true; the intended
   check is the one below. */
if (typeof(Storage) !== 'undefined') {
  $('#yay').fadeIn('slow');
} else {
  $('#ooh').fadeIn('slow');
}

The tool to fetch and render as a bot: https://technicalseo.com/seo-tools/fetch-render/
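To turn such a detection into a counter signal, one common pattern is to report the result back to the server with an image beacon, so the server-side counter can decide what to count. This is a sketch only; the `/track` endpoint and its query parameters are hypothetical:

```javascript
// Sketch: build the beacon URL (kept pure so it is easy to test).
// The /track endpoint and parameter names are hypothetical examples.
function trackUrl(hasStorage, now) {
  return '/track?storage=' + (hasStorage ? 1 : 0) + '&t=' + now;
}

// Fire a GET request without affecting the page; bots that don't run
// JavaScript will never send this beacon at all.
function reportVisit(hasStorage) {
  var img = new Image();
  img.src = trackUrl(hasStorage, Date.now());
}
```

The `t` timestamp is just a cache-buster so repeated views are not collapsed by the browser cache.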

The result of the render: (screenshot omitted)

Ali Sheikhpour