
One of my sites has a few public pages that serve results based on the location the person is searching from, similar to going to a weather website and searching for your local weather. In this case, though, the results are events in that vicinity and a few other result types.

From looking through a few index reports, it seems these results are either ignored completely or indexed only once, from the location reported by the bot.

Is there something I can/should do on the site or in the robots.txt file to help fix this?

I have thought about detecting the bot and returning all results to it, but I am concerned about the response time. Each result has its own page, but we use GUIDs for IDs, so I cannot think of a way to get the direct pages indexed either. Would a separate page linking to all results in a list be helpful?

Thanks. I tried Googling and asking friends and got nowhere.

RiddlerDev
  • There are several techniques for influencing how results appear on SERPs. You might want to look into [structured data](https://developers.google.com/search/docs/guides/intro-structured-data). There are also schemas for [Events](https://schema.org/Event). You should also use canonical URLs and all kinds of semantic data in the HTML itself, like `lang` attributes. – Daniel W. Mar 04 '20 at 20:41
  • But for that to work, it would have to load all events, right? The tagging is fine, but since we limit results based on location and search radius, it sounds like that would still be an issue. That is why I was wondering if I needed a way to load all results, or a landing page only seen by robots. – RiddlerDev Mar 04 '20 at 20:57
  • If you want 10 events from country A and 10 events from country B to be shown on search result pages, you must have 20 different URLs, and all should have their canonical and semantic/structured data. Put those pages in a `sitemap.xml` (better: make one for each language/country). That way Google fetches the pages from the sitemap rather than by crawling. – Daniel W. Mar 04 '20 at 21:27
  • Don't cloak your content. Means: Don't show Google something that is different from what you present to the normal visitor. It most probably results in negative SEO effects for your whole project. – Daniel W. Mar 04 '20 at 21:31
  • Makes sense, sitemap.xml is a great solution. Not sure why I was not thinking about it. Agreed on the second point; I was worried about that. If you want to add this as a solution, I can accept it. Thanks! – RiddlerDev Mar 04 '20 at 21:35
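
The structured data suggested in the comments can be embedded on each event page as JSON-LD. A minimal sketch, assuming a hypothetical venue, date, and event URL (the GUID and domain below are placeholders, not real data):

```html
<!-- Placed in the <head> or <body> of each event's page.
     All values here are illustrative placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Event",
  "startDate": "2020-04-01T19:00:00-05:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "Springfield",
      "addressCountry": "US"
    }
  },
  "url": "https://example.com/events/3f2504e0-4f89-11d3-9a0c-0305e82c3301"
}
</script>
```

Google's Rich Results Test can be used to verify the markup parses as an Event.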
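Since the event pages use GUID URLs that crawlers cannot guess, the sitemap suggested above can be generated from the event store. A minimal sketch, assuming a hypothetical `base_url` and a list of event GUIDs (both placeholders):

```python
# Sketch: build a sitemap.xml string listing one <url> entry per
# GUID-based event page, so crawlers can find pages they would
# never reach by following location-limited search results.
import xml.etree.ElementTree as ET


def build_sitemap(base_url, event_ids):
    """Return sitemap XML with one <url><loc> per event page."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for event_id in event_ids:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = f"{base_url}/events/{event_id}"
    return ET.tostring(urlset, encoding="unicode")


if __name__ == "__main__":
    # Placeholder GUID; in practice the list would come from the database.
    xml = build_sitemap(
        "https://example.com",
        ["3f2504e0-4f89-11d3-9a0c-0305e82c3301"],
    )
    print(xml)
```

The generated file would be served at a fixed path (e.g. `/sitemap.xml`) and referenced from `robots.txt` with a `Sitemap:` line, or submitted via Search Console.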

0 Answers