
I have a website that works with both JS on and off. All links on the page are initially in the regular format <a href="/pagename">, but if a visitor arrives with JS available they are rewritten to <a href="#/pagename"> and handled using the hashchange event.

This results in two possible URLs pointing to the same content (www.site.com/pagename and www.site.com/#/pagename).

Note: if you reach www.site.com/pagename with JS on, you will be automatically redirected to www.site.com/#/pagename.
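
Roughly, the script does something like this (a simplified sketch, not my exact code; loadPage is a placeholder for my real content loader):

```
// Placeholder for the real AJAX content loader.
function loadPage(path) {
    // fetch and render the content for "path"
}

// If the visitor landed on a plain URL with JS on, redirect to the hash version.
if (window.location.pathname !== '/') {
    window.location.replace('/#' + window.location.pathname);
}

// Rewrite regular links to hash links.
var links = document.getElementsByTagName('a');
for (var i = 0; i < links.length; i++) {
    var href = links[i].getAttribute('href');
    if (href && href.charAt(0) === '/') {
        links[i].setAttribute('href', '#' + href);
    }
}

// Handle navigation via the hashchange event.
window.addEventListener('hashchange', function () {
    loadPage(window.location.hash.slice(1));
});
```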

Now I'm wondering whether I should implement the hashbang format (www.site.com/#!/pagename) or not, since I don't know if this will result in duplicate content when crawled by bots. Google's FAQ wasn't much help on this specific subject.


1 Answer


This probably will cause duplicate content issues, but it's hard to say for sure since crawlable AJAX is still a new thing. However, you can easily solve this by using canonical URLs.
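
For example, every variant of a page can declare the plain URL as the preferred one in its head (a minimal sketch; the URL is just an illustration):

```
<!-- Served on www.site.com/pagename and any AJAX/hash variant of it:
     tells search engines which single URL should be indexed. -->
<link rel="canonical" href="http://www.site.com/pagename" />
```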

  • Ah, thanks for that link, I wasn't really sure what canonical URLs were before. Even without considering crawling, this helps a lot, since I can't really provide a reliable fallback for reaching the AJAX-ified URL if JS is disabled. – Onyx47 Nov 10 '11 at 15:33