I have installed Airflow with Postgres using docker compose. I can connect to that Postgres from Airflow by defining a connection in the Airflow UI. Now I want to do something different: I installed Postgres locally on my PC (not in Docker), and I would like Airflow, running in Docker, to access the PC's Postgres. How can I achieve that?
I've tried creating the following test DAG:
from datetime import datetime

import psycopg2
from airflow import DAG
from airflow.operators.python import PythonOperator


def execute_query_with_psycopg(my_query, **kwargs):
    print(my_query)  # log the query being executed
    conn_args = dict(
        host='localhost',
        user='postgres',
        password='qaz',
        dbname='postgres',
        port=5432)
    conn = psycopg2.connect(**conn_args)
    cur = conn.cursor()
    cur.execute(my_query)
    for row in cur:
        print(row)
    cur.close()
    conn.close()


with DAG(dag_id="test2",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@once",
         catchup=False) as dag:

    task1 = PythonOperator(
        task_id="test2_task",
        python_callable=execute_query_with_psycopg,
        op_kwargs={"my_query": 'select 1'})

    task1
Nevertheless, I am getting the following error:
conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Connection refused
    Is the server running on host "localhost" (127.0.0.1) and accepting
    TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
    Is the server running on host "localhost" (::1) and accepting
    TCP/IP connections on port 5432?
[2022-05-31 18:30:06,005] {taskinstance.py:1531} INFO - Marking task as FAILED. dag_id=test2, task_id=test2_task, execution_date=20220531T175542, start_date=20220531T183005, end_date=20220531T183006
[2022-05-31 18:30:06,098] {local_task_job.py:151} INFO - Task exited with return code 1
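I suspect the problem is that localhost inside the Airflow container refers to the container itself, not to my PC, but I am not sure how to reach the host's Postgres instead. As a sketch of what I think might work (assuming Docker Desktop's host.docker.internal alias for the host machine; I have not confirmed this is the right approach), I would change only the connection arguments:

import psycopg2

# Hypothetical fix: point psycopg2 at the Docker host instead of the
# container's own loopback interface. 'host.docker.internal' resolves to
# the host machine on Docker Desktop (Mac/Windows); on Linux it only
# exists if mapped explicitly (see note below).
conn = psycopg2.connect(
    host='host.docker.internal',  # instead of 'localhost'
    user='postgres',
    password='qaz',
    dbname='postgres',
    port=5432)

From what I have read, on Linux this alias also requires adding extra_hosts: ["host.docker.internal:host-gateway"] to the Airflow services in docker-compose.yml, and the PC's Postgres must accept TCP connections from Docker's network (listen_addresses in postgresql.conf and an entry in pg_hba.conf), but I have not managed to verify this setup. Is this the correct approach?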