I want to build an ETL pipeline that:
- Reads files from an on-prem filesystem
- Writes the files into a Cloud Storage bucket

Is it possible to import the files directly (regularly, every day) with the Storage Transfer Service? Let's suppose I want to build the pipeline with Dataflow, using Python as the programming language. Is it possible to implement such a workflow? If so, are there any Python examples with Apache Beam? I've sketched below what I have in mind for both steps.
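For the daily import, this is roughly what I was thinking, based on the google-cloud-storage-transfer client. The project, bucket, and directory names are placeholders, and I'm not sure I have the POSIX source configured correctly:

```python
# Sketch of a daily on-prem -> GCS transfer job using the
# google-cloud-storage-transfer client. "my-project", "my-bucket" and
# "/data/exports" are placeholders; as far as I understand, a transfer
# agent must also be running on the on-prem machine for a POSIX source.
from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

transfer_job = storage_transfer.TransferJob(
    project_id="my-project",
    description="Daily on-prem filesystem import",
    status=storage_transfer.TransferJob.Status.ENABLED,
    schedule=storage_transfer.Schedule(
        # A start date with no end date should make the job repeat daily.
        schedule_start_date={"year": 2024, "month": 1, "day": 1},
    ),
    transfer_spec=storage_transfer.TransferSpec(
        posix_data_source=storage_transfer.PosixFilesystem(
            root_directory="/data/exports",
        ),
        gcs_data_sink=storage_transfer.GcsData(bucket_name="my-bucket"),
    ),
)

response = client.create_transfer_job(
    storage_transfer.CreateTransferJobRequest(transfer_job=transfer_job)
)
print(f"Created transfer job: {response.name}")
```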
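And for the Dataflow step, something like this minimal Beam pipeline in Python is what I imagine (again, the bucket paths, region, and the transform are just placeholders):

```python
# Minimal Beam pipeline sketch: read files that the transfer job dropped
# into the bucket, apply a transform, write the results back to GCS.
# All resource names here are made up for the example.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/temp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFiles" >> beam.io.ReadFromText("gs://my-bucket/incoming/*.csv")
        | "Transform" >> beam.Map(lambda line: line.upper())  # placeholder transform
        | "WriteResults" >> beam.io.WriteToText("gs://my-bucket/output/result")
    )
```

Would a setup like this work, or is there a more standard way to wire the two services together?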
Thank you in advance.