The requests call is:
res = requests.post(
    f"{API_BASE_URL}/dags/sql_data_transfer/dagRuns",
    headers={
        'Content-Type': 'application/json',
        'Authorization': f'Bearer {TEST_TOKEN}',
    },
    json={
        'dag_run_id': dag_run_id,
        'conf': conf,
    },
)
This works locally, but in GitLab CI it fails whether I use localhost, 127.0.0.1, or docker as the host.
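To try those hosts I just swap the host portion of the base URL; a minimal sketch of how that is wired up (the env-var name and default below are placeholders, not my exact code):

import os

# Placeholder sketch: the host is injected per environment so the same
# test can target different addresses locally and in CI.
AIRFLOW_API_HOST = os.environ.get("AIRFLOW_API_HOST", "127.0.0.1")
API_BASE_URL = f"http://{AIRFLOW_API_HOST}:8080/api/v1"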
The error:
if isinstance(e.reason, _ProxyError):
raise ProxyError(e, request=request)
if isinstance(e.reason, _SSLError):
# This branch is for urllib3 v1.22 and later.
raise SSLError(e, request=request)
> raise ConnectionError(e, request=request)
E requests.exceptions.ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8080): Max retries exceeded with url: /api/v1/dags/sql_data_transfer/dagRuns (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8e98491bd0>: Failed to establish a new connection: [Errno 111] Connection refused'))
/home/airflow/.local/lib/python3.7/site-packages/requests/adapters.py:519: ConnectionError
docker-compose.yml:
version: '2.2'

x-airflow-common:
  &airflow-common
  image: airflow
  build: airflow
  environment:
    - AIRFLOW__CORE__EXECUTOR=LocalExecutor
    - AIRFLOW__CORE__LOAD_EXAMPLES=False
    - AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=True
    - AIRFLOW__LOGGING__LOGGING_LEVEL=INFO
  volumes:
    - ./airflow/dags:/opt/airflow/dags
    - ./airflow/airflow-data/logs:/opt/airflow/logs
    - ./airflow/airflow-data/plugins:/opt/airflow/plugins
    - ./airflow/config/airflow.cfg:/opt/airflow/airflow.cfg
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    - postgres

services:
  airflow-observer:
    <<: *airflow-common
    container_name: airflow_observer
    command: airflow webserver
    ports:
      - 8082:8080
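For reference, my understanding of which base URL should apply depending on where the test process runs, derived from the port mapping above (the variable names are only illustrative):

# Inside the compose network (i.e. from another container in this file),
# the webserver is reached via the service name and the container port:
IN_NETWORK_BASE_URL = "http://airflow-observer:8080/api/v1"

# From the host machine, the published side of the 8082:8080 mapping
# applies instead:
ON_HOST_BASE_URL = "http://localhost:8082/api/v1"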