A good start would be to read Google's documentation on this topic.
If you don't use hash fragments, then just block the page in robots.txt as usual. Google should respect this, but bear in mind that other crawlers, especially lesser-known ones, might not.
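As a sketch, assuming the sensitive page lives at a hypothetical path like /private-page, the robots.txt rule would look something like this:

```
User-agent: *
Disallow: /private-page
```

Remember that robots.txt is advisory only: well-behaved crawlers honor it, but it is not an access control mechanism, and it even advertises the path to anyone who reads the file.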
Another idea that comes to mind is to check the User-Agent header on your AJAX request. But then again, this doesn't prevent user-agent spoofing, so there will still be a subset of rogue crawlers that are out to get your sensitive content.
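A minimal sketch of that check, with an illustrative (not exhaustive) list of bot substrings; as noted above, any crawler can defeat this by spoofing its User-Agent:

```python
# Illustrative list of substrings commonly found in crawler User-Agents.
KNOWN_BOT_SUBSTRINGS = ("googlebot", "bingbot", "slurp", "crawler", "spider")

def is_probable_bot(user_agent):
    """Return True if the User-Agent header looks like a known crawler."""
    if not user_agent:
        # Many bots send no User-Agent at all; treat that as suspicious.
        return True
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_BOT_SUBSTRINGS)
```

You would call this before serving the AJAX response and return an error (or empty payload) when it returns True.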
You could probably find some other solution, perhaps a clever JavaScript hack, that prevents most crawlers from downloading your content, but this approach will never be reliable or sustainable because ultimately there are people working around the clock on better crawlers.
If your goal is to make absolutely sure something is not indexed, then it doesn't matter whether it's AJAX or not. Any sensitive data needs to be hidden behind some sort of authentication or a Turing test such as a CAPTCHA.
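To make the contrast with the earlier approaches concrete, here is a minimal sketch of gating the sensitive endpoint behind a server-side session check; the token set and payload are purely illustrative:

```python
# Hypothetical store of valid session tokens, e.g. issued at login.
ACTIVE_SESSIONS = {"s3cr3t-token-123"}

def serve_sensitive_data(session_token):
    """Return (status, body); only authenticated sessions get the payload."""
    if session_token not in ACTIVE_SESSIONS:
        # Crawlers without credentials get nothing useful to index.
        return (401, "Unauthorized")
    return (200, "sensitive payload")
```

Unlike robots.txt or User-Agent checks, this cannot be bypassed by a crawler simply changing how it identifies itself.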