A similar issue has been reported for `RgoogleMaps::getGeoCode()`, where it was linked to Google's rate limiting. Since `geocode()` also relies on the Google Maps API (unless `source = "dsk"`), rate limiting is likely causing problems here as well.
You can solve this the "stubborn" way by iterating over the locations of interest one at a time (e.g. using `for` or `*apply`) rather than passing one large vector of addresses to `geocode()` at once. Inside the loop, a `while` condition can then check whether coordinates were successfully retrieved for the current location and, if not, simply repeat the geocoding call until it succeeds.
library(ggmap)

## d is the vector of addresses to geocode
out = lapply(d, function(i) {
  gcd = geocode(i)
  ## retry until geocode() returns actual coordinates instead of NAs
  while (all(is.na(gcd))) {
    gcd = geocode(i)
  }
  data.frame(address = i, gcd)
})
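One caveat: the bare `while` loop will spin forever if an address can never be geocoded at all (e.g. a malformed address that yields `ZERO_RESULTS` also comes back as `NA`, not just `OVER_QUERY_LIMIT`). A more defensive variant caps the number of retries and pauses between attempts. The sketch below uses hypothetical names (`geocode_stubborn`, `max_tries`, `pause`); `geocode_fun` can be any function that returns a data frame of coordinates (all `NA` on failure), such as `ggmap::geocode`:

```r
## Hypothetical helper: retry a geocoding call up to max_tries times,
## pausing between attempts to ease off the rate limit.
geocode_stubborn = function(address, geocode_fun, max_tries = 5, pause = 1) {
  for (try in seq_len(max_tries)) {
    gcd = geocode_fun(address)
    ## stop retrying as soon as any coordinate came back
    if (!all(is.na(gcd))) return(gcd)
    Sys.sleep(pause)  # back off before the next attempt
  }
  gcd  # give up and return the all-NA result after max_tries attempts
}
```

Used in place of the plain `geocode(i)` call inside `lapply()`, this bounds the worst case instead of hanging on permanently unresolvable addresses.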
For example, during my last test run, the retrieval failed three times as indicated by the following warnings (this will likely look different on your machine):
Warning messages:
1: geocode failed with status OVER_QUERY_LIMIT, location = "Via del Tritone 123, 00187 Rome, Italy"
2: geocode failed with status OVER_QUERY_LIMIT, location = "Via del Tritone 123, 00187 Rome, Italy"
3: geocode failed with status OVER_QUERY_LIMIT, location = "Via dei Capocci 4/5, 00184 Rome, Italy"
Nonetheless, thanks to the `while` condition inside the outer loop, coordinates were eventually retrieved for all locations of interest:
> do.call(rbind, out)
address lon lat
1 Via del Tritone 123, 00187 Rome, Italy 12.48766 41.90328
2 Via dei Capocci 4/5, 00184 Rome, Italy 12.49321 41.89582
As an additional treat, this "stubborn" approach can easily be run in parallel (e.g. using `parLapply()` or `foreach()`), which might yield considerable speed gains when querying a larger number of addresses.