
How to extract data from Postgres and load it into Google Cloud Postgres using Google Cloud SQL?


I have an Airflow DAG that extracts data from Postgres using Sqoop and stores the data in the AWS cloud.

  • Is there any operator in Airflow that would help extract data from an on-premise database and load it directly into a Postgres database on Google Cloud?
  • Or can I reuse the data in the AWS cloud and put it directly into the Google Cloud database?
  • Or do I need to extract a CSV file from the on-premise RDBMS and use operators in Airflow to insert it into the target table in the cloud?
    I suggest [postgres_to_gcs_operator](https://airflow.apache.org/docs/stable/_api/airflow/contrib/operators/postgres_to_gcs_operator/index.html) then [CloudSqlInstanceImportOperator](https://airflow.apache.org/docs/stable/_api/airflow/contrib/operators/gcp_sql_operator/index.html#airflow.contrib.operators.gcp_sql_operator.CloudSqlInstanceImportOperator). – Hitobat May 11 '20 at 10:11
  • If you write a custom operator, you can use `PostgresHook` as described [here](https://stackoverflow.com/a/61562437/3679900) – y2k-shubham May 11 '20 at 16:37
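
A minimal sketch of the two-step route from the first comment: stage the table as a file in GCS, then have Cloud SQL import it. It assumes Airflow with the Google provider package installed (the contrib operators linked above were renamed to `PostgresToGCSOperator` and `CloudSQLImportInstanceOperator` in current provider releases), and the connection id, bucket, database, table, and instance names below are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.postgres_to_gcs import PostgresToGCSOperator
from airflow.providers.google.cloud.operators.cloud_sql import CloudSQLImportInstanceOperator

# Hypothetical names -- replace with your own bucket, object path and instance.
GCS_BUCKET = "my-staging-bucket"
GCS_FILE = "exports/my_table.csv"
CLOUDSQL_INSTANCE = "my-cloudsql-instance"

with DAG(
    dag_id="onprem_postgres_to_cloudsql",
    start_date=datetime(2020, 5, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Step 1: dump the on-premise table to a CSV file in GCS.
    extract_to_gcs = PostgresToGCSOperator(
        task_id="extract_to_gcs",
        postgres_conn_id="onprem_postgres",   # Airflow connection to the on-prem DB
        sql="SELECT * FROM my_table",
        bucket=GCS_BUCKET,
        filename=GCS_FILE,
        export_format="csv",
    )

    # Step 2: ask Cloud SQL to import that CSV into the target table.
    # The body follows the Cloud SQL Admin API import-request format.
    import_to_cloudsql = CloudSQLImportInstanceOperator(
        task_id="import_to_cloudsql",
        instance=CLOUDSQL_INSTANCE,
        body={
            "importContext": {
                "fileType": "CSV",
                "uri": f"gs://{GCS_BUCKET}/{GCS_FILE}",
                "database": "target_db",
                "csvImportOptions": {"table": "my_table"},
            }
        },
    )

    extract_to_gcs >> import_to_cloudsql
```

The alternative from the second comment is a custom operator that reads rows with a `PostgresHook` on the on-premise connection and writes them with a second `PostgresHook` pointed at the Cloud SQL instance; that skips the GCS staging step but streams every row through the Airflow worker. If you go the CSV route, check how the export handles header rows, since Cloud SQL's CSV import does not offer a header-skip option.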

0 Answers