The data source is a SaaS server's API endpoints, and the goal is to use Python (the Boto3 library) to move the data into an AWS S3 bucket. API access is granted via an authorized username/password combination plus a unique API key, and every run must first call the API to obtain a token before any further data can be fetched.
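For context, this is roughly the flow I have in mind. It is only a minimal sketch: the base URL, token endpoint path, header names, and response field (`/auth/token`, `x-api-key`, `access_token`) are assumptions, not the real SaaS API.

```python
import requests
import boto3

BASE_URL = "https://saas.example.com/api"   # placeholder SaaS base URL


def get_token(username, password, api_key):
    # Assumed token endpoint and payload shape -- adjust to the real API.
    resp = requests.post(
        f"{BASE_URL}/auth/token",
        json={"username": username, "password": password},
        headers={"x-api-key": api_key},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_and_upload(token, api_key, endpoint, bucket, key):
    # Call one endpoint with the token, then write the raw response body to S3.
    resp = requests.get(
        f"{BASE_URL}/{endpoint}",
        headers={"Authorization": f"Bearer {token}", "x-api-key": api_key},
        timeout=60,
    )
    resp.raise_for_status()
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=resp.content)
```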
I have 2 questions:
- How should I manage the secrets above: save them in a config file (*.ini, *.json, *.yaml) or store them in AWS Secrets Manager? (See the first sketch after this list.)
- The token handling is a bit challenging. The basic approach is: for each endpoint, fetch a new token and make the API call, but that ends up as far too many pipelines (if downstream business needs require data from, say, 100 endpoints, I would have to craft 100 pipelines, essentially repeating one universal template 100 times). (See the second sketch after this list.)
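On question 1, a common pattern is to keep the username/password/API key in AWS Secrets Manager rather than in a config file shipped with the code. Below is a minimal sketch, assuming the secret is stored as a JSON string under a placeholder name like `saas/api-credentials` with placeholder field names:

```python
import json
import boto3


def load_saas_credentials(secret_name="saas/api-credentials", region="us-east-1"):
    # Fetch the secret from AWS Secrets Manager and parse its JSON payload.
    client = boto3.client("secretsmanager", region_name=region)
    secret = client.get_secret_value(SecretId=secret_name)
    creds = json.loads(secret["SecretString"])
    return creds["username"], creds["password"], creds["api_key"]
```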
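On question 2, instead of 100 near-identical pipelines, one option I'm considering is a single config-driven job that fetches the token once and loops over a list of endpoints, reusing the helpers sketched above (`load_saas_credentials`, `get_token`, `fetch_and_upload`). The endpoint names and bucket name here are placeholders:

```python
# The endpoint list could come from a YAML/JSON config instead of being hard-coded.
ENDPOINTS = ["customers", "orders", "invoices"]   # placeholder endpoint names


def run_pipeline(bucket="my-data-lake-bucket"):   # placeholder bucket name
    username, password, api_key = load_saas_credentials()
    token = get_token(username, password, api_key)   # fetch the token once
    for endpoint in ENDPOINTS:
        # Reuse the same token for every endpoint instead of re-authenticating.
        fetch_and_upload(token, api_key, endpoint, bucket, f"raw/{endpoint}.json")


if __name__ == "__main__":
    run_pipeline()
```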
I am new to the Python programming world, so please feel free to comment and share any use cases. Much appreciated!
I searched and read these examples:

- Saving from API to S3 bucket (saving-from-api-to-s3-bucket/74648533)
- How to write a file or data to an S3 object using boto3 (how-to-write-a-file-or-data-to-an-s3-object-using-boto3)