I am doing some geocoding of street addresses (n=18,000) using the ggmap package in R and the Google Maps API, which I understand has a limit of 2,500 geocoding requests per day for addresses.
The geocoding script I'm using is very simple and works on the small test dfs I've tried (like the sample below), but I'm wondering about the simplest/most elegant way to stitch the roughly seven daily 2,500-row chunks back together into a single geocoded df of all 18,000 locations.
I'd thought about just numbering the daily results and then binding them all together at the end (see the bind_rows() sketch after the sample), running the following mutate_geocode() call each day on a df that looks like the sample below:
```r
library(ggmap)
library(tidyverse)
register_google(key = "MY API KEY", write = TRUE)

# sample of the full 18,000-row address data frame
pharmacies <- data.frame(
  pharm_id = c("00001", "00002", "00003"),
  address = c("250 S. Colonial Drive, Alabaster, AL 35007",
              "6181 U.S. Highway 431, Albertville, AL 35950",
              "113 Third Avenue S.E., Aliceville, AL 35442")
)
pharmacies_geocoded_1 <- mutate_geocode(pharmacies, address, output = "latlon")
```
| pharm_id | address |
|----------|---------|
| 00001 | 250 S. Colonial Drive, Alabaster, AL 35007 |
| 00002 | 6181 U.S. Highway 431, Albertville, AL 35950 |
| 00003 | 113 Third Avenue S.E., Aliceville, AL 35442 |
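To be concrete, the final binding step I had in mind would look something like this (the numbered data frames are just illustrative; one would be produced per day):

```r
# hypothetical final step after ~7 daily runs, one numbered df per day
pharmacies_geocoded <- bind_rows(
  pharmacies_geocoded_1,
  pharmacies_geocoded_2,
  pharmacies_geocoded_3
  # ... through pharmacies_geocoded_7
)
```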
But it seems like manually doing this day by day will get messy, and I suspect there's a more elegant loop strategy that I could set up once and walk away from.
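For illustration, here is a rough sketch of the kind of loop I'm imagining (the chunk size, the .rds file names, and the file-based caching are placeholders I haven't tested, and I'm not sure how mutate_geocode() behaves once the daily quota is exhausted):

```r
library(ggmap)
library(tidyverse)

# split the full data frame into 2,500-row chunks (placeholder chunk size)
chunk_size <- 2500
chunks <- split(pharmacies, ceiling(seq_len(nrow(pharmacies)) / chunk_size))

# geocode one chunk per run and cache it to disk, so chunks that are
# already done get skipped on the next day's run
for (i in seq_along(chunks)) {
  out_file <- sprintf("geocoded_chunk_%02d.rds", i)
  if (file.exists(out_file)) next
  geocoded <- mutate_geocode(chunks[[i]], address, output = "latlon")
  saveRDS(geocoded, out_file)
  break  # stop after one chunk; re-run the script the next day
}

# once all chunks exist on disk, stitch them back together
pharmacies_geocoded <- list.files(pattern = "^geocoded_chunk_.*\\.rds$") %>%
  map(readRDS) %>%
  bind_rows()
```

Is there a better or more elegant way to handle this?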