I have Python code that reads data from an API and creates a JSON file (it's not just a simple read; there are some transformations as well).
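For context, the script is roughly shaped like this (the API, field names, and transformation are placeholders, not my real code):

```python
import json

def transform(records):
    # Placeholder transformation: filter records and reshape fields.
    # My real script does several steps like this on the API response.
    return [
        {"id": r["id"], "value": r["amount"] * 100}
        for r in records
        if r.get("status") == "active"
    ]

# In the real script this comes from an HTTP API call.
sample = [
    {"id": 1, "amount": 2.5, "status": "active"},
    {"id": 2, "amount": 1.0, "status": "inactive"},
]

payload = json.dumps(transform(sample))
# payload would then be uploaded to a Cloud Storage bucket.
```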
I need to get the data into GCP (specifically Cloud Storage), and it needs to run once every 24 hours. A Google Cloud Function seemed like the ideal solution, but it has an execution time limit of 9 minutes, so the code doesn't work there.
What other options do I have in GCP? Dataflow? Can I use my standard Python code in the Beam framework? Data Fusion? I doubt it.
Any other suggestions ?
P.S. This is my first question on Stack Overflow, so please let me know if the format of my question is incorrect or if it can be improved upon.