I'm currently trying to retrieve the contents of a web page that serves JSON and updates very regularly. Ideally, I would like to download the data from this URL every 10 minutes and append each snapshot to a growing data structure of updates.
Is there a way to repeatedly run a Python function at a user-defined interval (every 10 minutes in my case)?
Right now I'm able to download the JSON once using the code below, but I'd like to fetch it again every 10 minutes and keep adding the new data to what I've already collected.
import urllib.request
import json

with urllib.request.urlopen(URL) as url:
    data = json.loads(url.read().decode())
    print(data)
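
Something like the sketch below is what I have in mind: a simple loop that calls a fetch function and then sleeps for 10 minutes. The fetch_json helper, the all_updates list, and the example URL are just placeholder names I made up for illustration. I'm not sure whether a plain time.sleep loop is the right approach, or whether I should be using something like threading.Timer or a scheduling library instead.

import json
import time
import urllib.request

URL = "https://example.com/data.json"  # placeholder; my real endpoint goes here

def fetch_json(url):
    # download and decode one snapshot of the JSON payload
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read().decode())

all_updates = []  # growing list of snapshots

while True:
    all_updates.append(fetch_json(URL))
    print(f"collected {len(all_updates)} snapshots")
    time.sleep(600)  # wait 10 minutes before the next download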