
I have installed Airflow with Postgres using Docker Compose, and I am able to connect to that Postgres from Airflow by defining a connection in the Airflow UI. Now I want to do something different: I have installed Postgres locally on my PC (not in Docker), and I would like the Airflow running in Docker to access the PC's Postgres. How can I achieve that?

I've tried to create a test DAG as follows:

from airflow import DAG
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.operators.python import PythonOperator
from datetime import datetime
import psycopg2


def execute_query_with_psycopg(my_query, **kwargs):
    print(my_query)  # log the query being executed, e.g. 'select 1'
    conn_args = dict(
        host='localhost',
        user='postgres',
        password='qaz',
        dbname='postgres',
        port=5432)
    conn = psycopg2.connect(**conn_args)
    cur = conn.cursor()
    cur.execute(my_query)

    for row in cur:
        print(row)

    # Release the database resources once the results are printed.
    cur.close()
    conn.close()


with DAG(dag_id="test2",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@once",
         catchup=False) as dag:

    task1 = PythonOperator(
        task_id="test2_task",
        python_callable=execute_query_with_psycopg,
        op_kwargs={"my_query": 'select 1'})

task1

Nevertheless, I am getting the following error:

conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
psycopg2.OperationalError: could not connect to server: Connection refused
    Is the server running on host "localhost" (127.0.0.1) and accepting
    TCP/IP connections on port 5432?
could not connect to server: Cannot assign requested address
    Is the server running on host "localhost" (::1) and accepting
    TCP/IP connections on port 5432?

[2022-05-31 18:30:06,005] {taskinstance.py:1531} INFO - Marking task as FAILED. dag_id=test2, task_id=test2_task, execution_date=20220531T175542, start_date=20220531T183005, end_date=20220531T183006
[2022-05-31 18:30:06,098] {local_task_job.py:151} INFO - Task exited with return code 1

2 Answers


For anyone having the same issue, this is how I resolved it: specify the host as follows:

host='host.docker.internal'
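
To make the fix concrete, here is a minimal sketch of the question's connection arguments with only the host changed (the credentials are the placeholders from the question). Inside a container, 'localhost' refers to the container itself, which is why the original code fails. host.docker.internal resolves to the host machine out of the box on Docker Desktop for Windows and macOS; on Linux you may need to map it yourself, for example with an extra_hosts entry of "host.docker.internal:host-gateway" on the Airflow services in your compose file (supported since Docker 20.10).

import psycopg2

# Same connection arguments as in the question; only the host differs.
# 'host.docker.internal' points at the host machine from inside the container.
conn_args = dict(
    host='host.docker.internal',  # was 'localhost', which is the container itself
    user='postgres',
    password='qaz',
    dbname='postgres',
    port=5432)

with psycopg2.connect(**conn_args) as conn:
    with conn.cursor() as cur:
        cur.execute('select 1')
        print(cur.fetchone())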
  • Thank you so much for this! I've been looking for an answer for almost a week! However, my situation is the opposite: I have Postgres and Airflow in Docker, and I couldn't connect to Postgres from my DAG. – Vlad Vlad Jul 16 '23 at 16:41

Docker does not use the same network as your computer, so a PostgreSQL server running locally and a Docker container running locally are not necessarily able to reach each other. When you started the images using Docker Compose, the situation was different, because Docker Compose creates a network bridge between the containers it starts.
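
If you want to verify the networking before involving Airflow at all, a quick check (a sketch, assuming the host.docker.internal approach from the other answer) is to probe the Postgres port from inside the Airflow container:

import socket

# Connectivity probe: try to open a TCP connection from inside the container
# to the host machine's Postgres port. Failure here indicates a networking
# problem, not an Airflow problem.
try:
    with socket.create_connection(('host.docker.internal', 5432), timeout=3):
        print('Postgres port is reachable from the container')
except OSError as exc:
    print(f'Cannot reach Postgres: {exc}')

Note also that a locally installed Postgres often listens only on localhost (the listen_addresses setting in postgresql.conf), which may need adjusting before connections from a container are accepted.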

Regarding how to connect your local PostgreSQL to the Airflow Docker container, you can consult the following question: Allow docker container to connect to a local/host postgres database

If you want your Airflow Docker container to share your local PC's network (the entire network stack), you can start the container with the --network=host parameter, but use it with caution :)
