Let me preface this with the fact that I am fairly new to Docker, Jenkins, GCP/Cloud Storage and Python.
Basically, I would like to write a Python app that runs locally in a Docker container (based on the alpine3.7 image) and reads, line by line, a very large text file that is dropped into a GCP Cloud Storage bucket. For now, each line should just be printed to the console.
I learn best by looking at working code, but I am spinning my wheels trying to put all the pieces together using these technologies, all of which are new to me.
I already have the key file for that cloud storage bucket on my local machine.
I am also aware of these posts:
- How to Read .json file in python code from google cloud storage bucket.
- Lazy Method for Reading Big File in Python?
I just need some help putting all these pieces together into a working app.
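To make the question concrete, here is the kind of skeleton I am imagining — a rough sketch, not tested against a real bucket. It assumes the `google-cloud-storage` client library, and the bucket and object names are placeholders:

```python
def print_lines(fileobj):
    """Print each line of a text stream without loading the whole file into memory."""
    for line in fileobj:
        print(line.rstrip("\n"))


def main():
    # Imported here so the line-printing helper above has no GCS dependency.
    # Requires the google-cloud-storage package.
    from google.cloud import storage

    # The client picks up credentials from GOOGLE_APPLICATION_CREDENTIALS.
    client = storage.Client()

    # "my-bucket" and "very-large-file.txt" are placeholder names.
    bucket = client.bucket("my-bucket")
    blob = bucket.blob("very-large-file.txt")

    # blob.open("rt") returns a file-like object that streams the blob
    # in chunks, so the whole file is never held in memory at once.
    with blob.open("rt") as f:
        print_lines(f)


if __name__ == "__main__":
    main()
```

Is this roughly the right shape, or is there a better pattern for streaming a large object line by line?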
I understand that I need to set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the key file's path inside the container. However, I don't know how to do that in a way that works well for multiple developers and multiple environments (local, dev, stage, and prod).
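For example, is something like the following docker-compose fragment a reasonable approach — mounting each developer's key file at run time and setting the variable there, rather than baking the key into the image? (All paths and names below are placeholders I made up.)

```yaml
# docker-compose.override.yml — kept per-developer and git-ignored,
# so each person can point at their own key file.
services:
  app:
    environment:
      GOOGLE_APPLICATION_CREDENTIALS: /secrets/gcs-key.json
    volumes:
      # Host path differs per developer; container path stays fixed.
      - ./keys/dev-key.json:/secrets/gcs-key.json:ro
```

Or is there a more standard pattern for handling this across local, dev, stage, and prod?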