
I wrote a Python script that retrieves calendar events from an external source and inserts them into my Google Calendar using the Google Calendar API. It works locally when I run the script from my command line, but I would like it to run automatically so that the externally added events appear in my Google Calendar without any manual step.

A cron job appears to be the best way to do this, and since I'm already using the Google Calendar API, I thought Cloud Functions with Cloud Scheduler might be the right tool. However, I don't know where to start, or whether this is even possible, because accessing the API requires OAuth against my personal Google account, which I don't think a service account (which I believe I would need) can do on my behalf.

What steps do I need to take so that the script I currently run manually, which authenticates me with Google Calendar, runs in the cloud every 60 seconds, so that I don't need to keep my computer on at all times?

Things I’ve tried to do:

I created a service account with full permissions and tried to create an HTTP-triggered function that would, in theory, run the script when its URL is hit. However, it just returns an HTTP 500 error. I also tried using a Pub/Sub event target to listen and execute the script, but that doesn't work either.

Something I’m confused about:

With either account, there needs to be a credentials.json file in order to log in; how does this file get "deployed" alongside the main function? The same goes for the token.pickle file that gets created the first time the authentication happens.

gellerm
  • _What are the steps I need to take in order to allow the script which I manually run and authenticates me with Google Calendar run every 60 seconds ideally in the cloud so that I don't need to have my computer on at all times?_ Why not try to do it yourself, and ask a question only once you encounter an obstacle? As it stands, this is far too broad/vague. Please see [ask], [help/on-topic]. – AMC Jun 30 '20 at 01:30
  • @AMC I’ve edited to include what I’ve already tried doing...been wrestling with this for hours already – gellerm Jun 30 '20 at 01:38
  • If you want this to run in the cloud, is [Apps Script](https://developers.google.com/apps-script) not an option for you? It's designed to work well in tandem with Google Services, and even has [time-based triggers](https://developers.google.com/apps-script/guides/triggers/installable#time-driven_triggers) like cron. – Rafa Guillermo Jun 30 '20 at 08:07
  • @RafaGuillermo probably not unfortunately since Apps Script looks like it works in JavaScript and the API I use to retrieve data in the first place was written with a wrapper only for Python – gellerm Jun 30 '20 at 11:37
  • it's quite easy. 1/ throw away the python library and call the https endpoints directly. 2/ see https://stackoverflow.com/questions/19766912/how-do-i-authorise-an-app-web-or-installed-without-user-intervention/19766913#19766913 for how to create and use a refresh token – pinoyyid Aug 01 '20 at 12:46

1 Answer


The way a service account works is that it needs to be preauthorized. You would take the service account's email address and share a calendar with it just as you would with any other user. The catch is that you should only do this with calendars you, the developer, control. If these are calendars owned by other people, you shouldn't be using a service account.
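For illustration, here is a minimal sketch of the service-account route using the google-auth and google-api-python-client libraries, assuming the target calendar has already been shared with the service account's email address. The key filename and calendar ID below are placeholders, not values from the question.

```python
# Sketch: insert an event into a calendar that was shared with the service account.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/calendar"]

# "service-account.json" is a hypothetical key file deployed with the code.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("calendar", "v3", credentials=creds)

event = {
    "summary": "Imported event",
    "start": {"dateTime": "2020-07-01T10:00:00-04:00"},
    "end": {"dateTime": "2020-07-01T11:00:00-04:00"},
}
# The calendar ID is a placeholder for the calendar you shared with the service account.
service.events().insert(
    calendarId="your-calendar-id@group.calendar.google.com", body=event
).execute()
```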

The way OAuth2 works is that the user is shown a consent screen to grant your application access to their data. Once the user has granted you access, and assuming you requested offline access, you should have a refresh token for that user's account. Using the refresh token you can request a new access token at any time. So the trick here is storing the user's refresh token somewhere your script can access it; then, when the cron job runs, the first thing it needs to do is request a new access token using that refresh token.

So the only way you will be able to do this as a cron job is if you have a refresh token stored for the account you want to access. Otherwise the flow would need to open a web browser to request the user's consent, and you can't do that from a cron job.
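As a rough sketch of that refresh-token approach with the google-auth library: the client ID, client secret, and refresh token obtained from the one-time manual consent flow are assumed to be stored somewhere the job can read them (the environment variable names below are hypothetical).

```python
# Sketch: exchange a stored refresh token for a fresh access token, then call the API.
import os

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials(
    None,  # no access token yet; it is fetched via refresh() below
    refresh_token=os.environ["GCAL_REFRESH_TOKEN"],
    token_uri="https://oauth2.googleapis.com/token",
    client_id=os.environ["GCAL_CLIENT_ID"],
    client_secret=os.environ["GCAL_CLIENT_SECRET"],
    scopes=["https://www.googleapis.com/auth/calendar"],
)

creds.refresh(Request())  # requests a new access token using the refresh token
service = build("calendar", "v3", credentials=creds)
events = service.events().list(calendarId="primary", maxResults=5).execute()
```

Note that if the refresh token is ever revoked, the manual consent flow has to be repeated locally to obtain a new one.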

Linda Lawton - DaImTo
  • Thank you for the detailed answer! If I create a token for my account manually (and have it be saved in a pickle file), can I reuse that for the cron job? – gellerm Jun 30 '20 at 11:35
  • Yup, as long as it's a refresh token you can. Just test it; I have no idea how Python works with loading the refresh token at run time. The library is probably designed to do it somehow, you just need to figure that part out. Something like this maybe: https://stackoverflow.com/a/37418906/1841839. Sorry, I'm not a Python dev, so my assistance here is limited. If you can't get it to work, put up a new question about loading a refresh token with the client library. – Linda Lawton - DaImTo Jun 30 '20 at 11:40
  • I see. I guess what I still don't get is that Google Cloud Functions appears to deploy only the function I wrote in my main.py file, while the token is a separate file. I'm still not sure how I would tell it to deploy the token alongside it. – gellerm Jun 30 '20 at 11:41
  • You should be able to store the token as a variable in your main.py and then load it from that. Security-wise it's probably going to give someone nightmares, but let's not talk about that. – Linda Lawton - DaImTo Jun 30 '20 at 11:48
  • It turns out I was terribly overthinking this. It read the token in just fine even though it was a separate file; however, in the function itself I never included a parameter for accepting the request, even though it was never used, i.e. I had `main()` when I really needed `main(request)`. Oh well! – gellerm Jul 04 '20 at 17:05
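Putting the comment thread together, a minimal sketch of an HTTP-triggered Cloud Functions entry point might look like the following. It assumes token.pickle (produced by the local OAuth flow) is deployed in the same directory as main.py; the entry-point name `main` matches the comment above, and the rest is an assumption rather than the asker's actual code.

```python
# main.py — sketch of an HTTP-triggered Cloud Function reading a deployed token.pickle.
import pickle

from google.auth.transport.requests import Request
from googleapiclient.discovery import build


def main(request):  # HTTP-triggered functions must accept the request object
    # token.pickle is deployed alongside main.py and contains the stored credentials.
    with open("token.pickle", "rb") as handle:
        creds = pickle.load(handle)

    # If the access token has expired, use the refresh token to get a new one.
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())

    service = build("calendar", "v3", credentials=creds)
    events = service.events().list(calendarId="primary", maxResults=5).execute()
    return f"Fetched {len(events.get('items', []))} events"
```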