I have the following code designed to pull JSON data from a website and record it to a CSV file:
import json
import urllib.request

import pandas as pd

def rec_price():
    # Fetch the JSON payload and load it into a DataFrame
    with urllib.request.urlopen('some_url') as url:
        data = json.loads(url.read().decode())
    df = pd.DataFrame(data)
    df1 = df[['bpi', 'time']]
    # Pull out the USD rate and the 'updated' timestamp
    x = df1.loc['USD', 'bpi']['rate']
    y = df1.loc['updated', 'time']
    df2 = pd.DataFrame({'data': [x], 'time': [y]})
    df2['time'] = pd.to_datetime(df2['time'])
    # Append the single-row frame to the CSV
    with open('out.csv', 'a') as f:
        df2.to_csv(f, header=False)
I would like to run this code every 60 seconds, indefinitely. It seems like the two options available are to install apscheduler or to use Python's standard sched and time modules... I would like to know: what are the differences between the two modules? Is one better suited to the task? How would I implement it?
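For reference, here is a minimal sketch of how I imagine the sched approach might look, assuming rec_price is defined as above and that having the job re-schedule itself from inside its own callback is the intended pattern:

import sched
import time

scheduler = sched.scheduler(time.time, time.sleep)

def run_every_minute():
    # Record one row, then queue the next run 60 seconds from now
    rec_price()
    scheduler.enter(60, 1, run_every_minute)

# Queue the first run and block forever processing scheduled events
scheduler.enter(60, 1, run_every_minute)
scheduler.run()

Is this roughly the right idea for sched, or is apscheduler the better fit for this kind of recurring job?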