One of my sites has a few public pages that serve results based on the location the visitor searches for, similar to a weather website where you can look up your local forecast. In my case, though, the results are events in that vicinity, plus a few other result types.
From looking through a few index reports, it seems these results are either ignored completely or only indexed for the single location detected for the crawler.
Is there something I can/should do on the site or in the robots.txt file to help fix this?
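From what I've read, robots.txt mostly blocks crawling rather than encouraging it, but it can at least point crawlers at a sitemap via the Sitemap directive. Would something along these lines be the right direction (the domain and sitemap path are just placeholders)?

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```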
I have thought about detecting the bot and returning all results to it, but I am concerned about how long that response would take. Each result does have its own page, but we use GUIDs for IDs, so the crawler has no way to discover those direct URLs on its own. Would a separate page linking to every result in a list be helpful (something like the sketch below)?
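If it matters, this is roughly what I had in mind: generate an XML sitemap (or an equivalent HTML list page) straight from the stored GUIDs, so each result page is discoverable without a location search. This is just a sketch; the database, table/column names, and URL pattern are all made up for illustration.

```python
# Hypothetical sketch: build an XML sitemap from stored result GUIDs so
# crawlers can reach each result page directly. The sitemaps protocol
# caps a single file at 50,000 URLs, so a real version might need to
# split the output into multiple files.
import sqlite3
from xml.sax.saxutils import escape

BASE_URL = "https://example.com/results/"  # assumed URL pattern for a result page


def build_sitemap(db_path: str) -> str:
    # Pull every result GUID from a hypothetical "results" table.
    conn = sqlite3.connect(db_path)
    guids = [row[0] for row in conn.execute("SELECT guid FROM results")]
    conn.close()

    # One <url> entry per result page, with the URL XML-escaped.
    entries = "\n".join(
        f"  <url><loc>{escape(BASE_URL + guid)}</loc></url>" for guid in guids
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )


if __name__ == "__main__":
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap("results.db"))
```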
Thanks in advance. I've tried Googling this and asking friends, and got nowhere.