Since you want the page to still be crawled, robots.txt is probably not an option for you.
Generally, you should first ask whether you are using the API correctly. An API is there to obtain data or to perform an operation.
What you should not do is ask the API for the same information on every page view. Cache it instead.
Sometimes it is fine to simply cache the result in a text file; sometimes you want to crawl the data into your own database.
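For the simple file-based variant, a minimal sketch could look like this (the fetchFromApi() function and the cache path are assumptions, adjust them to your setup):

// A minimal sketch of file-based caching. fetchFromApi() is a
// hypothetical function standing in for your actual API request.
$cacheFile = 'api_cache.txt'; // assumed path
$maxAge    = 3600;            // cache lifetime in seconds

if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge)
{
    // cache is fresh enough: reuse the stored result
    $data = file_get_contents($cacheFile);
}
else
{
    // cache is stale or missing: ask the API once and store the result
    $data = fetchFromApi();
    file_put_contents($cacheFile, $data);
}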
If that is not an option for you, you can detect Googlebot like this:
// HTTP_USER_AGENT may be missing, so check it before using it
if (isset($_SERVER['HTTP_USER_AGENT']) && strpos(strtolower($_SERVER['HTTP_USER_AGENT']), 'googlebot') !== false)
{
    // serve the cached version here
}
At the very least, serve Googlebot a cached version.
Also note that this is not a Googlebot-only problem. There are many bots out there, and there are also bad bots that pose as normal users. And if you are under heavy load, that can become a problem as well.
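If you want to catch more than just Googlebot, a rough sketch could check the User-Agent against a list of known crawler tokens. The list below is only an example and far from complete, and the header can be spoofed, so treat it as a hint, not a guarantee:

// Example crawler tokens; extend the list as needed.
$botTokens = array('googlebot', 'bingbot', 'yandexbot', 'baiduspider');
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';

$isBot = false;
foreach ($botTokens as $token)
{
    if (strpos($userAgent, $token) !== false)
    {
        $isBot = true;
        break;
    }
}

if ($isBot)
{
    // serve the cached version here
}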