For a while now we've been using https://github.com/PMassicotte/gtrendsR to pull Google Trends data by keyword.
Our usual approach is to spin up a few boxes, fire off some requests, tear those boxes down, then spin up others. We do this once a week, all in an attempt to stay within Google's quotas and rate limits.
We had been doing this for 6+ months, but this week it stopped working. Every box we use, even ones with fresh IPs, now gets "HTTP 429 Too Many Requests".
All our boxes are spun up within our AWS account. Is Google (or anyone) able to block a set of different IPs because they come from the same "place"? That's what I believe is happening, but I don't fully understand it.
I'm trying to learn how this aspect of the internet works. If that's what's going on, then fair enough: it means we've hit the limit of keeping all our scraping boxes inside the same AWS cloud.
The error we're getting when using the gtrendsR package is:
    Error in get_widget(comparison_item, category, gprop, hl, cookie_url, :
      widget$status_code == 200 is not TRUE
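For reference, a minimal call that reproduces the failure looks like the sketch below (the keyword and parameters are just placeholders, not our real queries):

```r
# Minimal reproduction sketch. Any keyword fails the same way once
# Google starts answering with HTTP 429.
library(gtrendsR)

res <- tryCatch(
  gtrends(keyword = "example", geo = "US", time = "today 12-m"),
  error = function(e) {
    # gtrendsR surfaces the 429 as a failed assertion on the widget
    # status code rather than as a structured HTTP error object.
    message("Request failed: ", conditionMessage(e))
    NULL
  }
)
```

Wrapping the call in tryCatch() at least lets us log the failure and move on to the next keyword instead of aborting the whole job, but every request on every box now fails this way.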
There's a lot of discussion around this, with people trying different workarounds over the last 4+ years. For the most part the advice boils down to: change your IP.