I am receiving the error below in the task logs when running the DAG:
FileNotFoundError: [Errno 2] No such file or directory: 'beeline': 'beeline'
This is my DAG:
import airflow
from airflow import DAG
from airflow.providers.apache.hive.operators.hive import HiveOperator
from airflow.utils.dates import days_ago
from datetime import timedelta

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': airflow.utils.dates.days_ago(2),
    'email': ['airflow@example.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag_data_summarizer = DAG(
    dag_id="data_summarizer",
    default_args=default_args,
    description='Data summarizer DAG',
    schedule_interval='*/20 * * * *',
    start_date=airflow.utils.dates.days_ago(1)
)
hql_query = """create database if not exists new_test_db;"""
hive_task = HiveOperator(
    hql=hql_query,
    task_id="data_retrieval",
    hive_cli_conn_id="new_hive_conn",
    dag=dag_data_summarizer,
    run_as_user="airflow"  # the airflow user has the beeline executable set in PATH
)

if __name__ == '__main__':
    dag_data_summarizer.cli()
The new_hive_conn connection is of type "hive_cli" (I also tried the "hiveserver2" connection type; that did not work either).
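For reference, a programmatic definition equivalent to my new_hive_conn connection would look roughly like the sketch below. The host, port, and schema come from the JDBC URL in the task log; the extras are my best guess at the settings that produce that URL:

from airflow.models import Connection
from airflow import settings

# Sketch of new_hive_conn. The "use_beeline" extra makes HiveCliHook invoke beeline
# instead of the plain hive CLI; "auth" is a guess based on the "auth=NONE" in the log.
new_hive_conn = Connection(
    conn_id="new_hive_conn",
    conn_type="hive_cli",
    host="hive-server-1",
    port=10000,
    schema="default",
    extra='{"use_beeline": true, "auth": "NONE"}',
)

session = settings.Session()
session.add(new_hive_conn)
session.commit()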
The task log prints the following command:
beeline -u "jdbc:hive2://hive-server-1:10000/default;auth=NONE"
When I run this command manually on the worker Docker container, it works and I am connected to the Hive server.
The worker container has the beeline executable on the PATH for both the "airflow" and "root" users:
/home/airflow/.local/bin:/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/airflow/hive/apache-hive-2.3.2-bin/bin:/home/airflow/hadoop/hadoop-3.3.1/bin
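A quick way to compare the PATH that a task process actually sees with the one from my interactive shell would be a throwaway diagnostic task along these lines (sketch only; the task id is arbitrary):

from airflow.operators.bash import BashOperator

# Prints the PATH visible to the task process and tries to resolve beeline,
# for comparison with the PATH shown above from an interactive shell.
check_beeline = BashOperator(
    task_id="check_beeline_path",
    bash_command='echo "PATH=$PATH"; command -v beeline || echo "beeline not on PATH"',
    run_as_user="airflow",  # same run_as_user as the failing HiveOperator task
    dag=dag_data_summarizer,
)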