I have a DAG defined like this:
tmpl_search_path = '/home/airflow/gcs/sql_requests/'

with DAG(dag_id='pipeline', default_args=default_args,
         template_searchpath=[tmpl_search_path]) as dag:
    create_table = bigquery_operator.BigQueryOperator(
        task_id='create_table',
        sql='create_table.sql',
        use_legacy_sql=False,
        destination_dataset_table=some_table)
The task create_table calls a SQL script, create_table.sql. This script is not in the dags folder: it lives in a sql_requests folder at the same level as the dags folder.
The layout inside the Cloud Composer bucket (GCP's managed Airflow) is:
bucket_name
|- airflow.cfg
|- dags
|  |- pipeline.py
|  |- ...
|- sql_requests
|  |- create_table.sql
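For context, Cloud Composer only syncs certain top-level bucket folders (dags/, data/, plugins/) to the workers' local disk under /home/airflow/gcs, so a folder like sql_requests/ at the bucket root gets no local counterpart. A minimal sketch of that mapping (local_mount is my own illustrative helper, not a Composer API):

```python
def local_mount(bucket_path):
    """Map a path inside the Composer bucket to its worker-local path,
    or return None when Composer does not sync that top-level folder.
    (Illustrative helper; the real mapping is done by Composer itself.)"""
    synced = ('dags', 'data', 'plugins')  # folders Composer makes available locally
    top = bucket_path.strip('/').split('/')[0]
    if top not in synced:
        return None
    return '/home/airflow/gcs/' + bucket_path.strip('/')

# gs://bucket/dags/pipeline.py is visible on the worker...
print(local_mount('dags/pipeline.py'))          # /home/airflow/gcs/dags/pipeline.py
# ...but gs://bucket/sql_requests/ is not, hence no file to load:
print(local_mount('sql_requests/create_table.sql'))  # None
```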
What path do I need to set in template_searchpath to reference the sql_requests folder inside the Composer bucket on GCP?
I have tried template_searchpath=['/home/airflow/gcs/sql_requests'], template_searchpath=['../sql_requests'] and template_searchpath=['/sql_requests'], but none of these works.
The error message I get is jinja2.exceptions.TemplateNotFound.
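For what it's worth, TemplateNotFound just means Jinja2's file system loader could not find the file in any of the configured search paths on the worker's local disk. A minimal sketch of that lookup (find_template is my own helper mimicking the loader, not Airflow code):

```python
import os

def find_template(name, search_paths):
    """Mimic jinja2.FileSystemLoader: return the first search path that
    actually contains the template file, or None, in which case Jinja2
    would raise TemplateNotFound."""
    for path in search_paths:
        candidate = os.path.join(path, name)
        if os.path.isfile(candidate):
            return candidate
    return None

# If the search path exists only in the GCS bucket and is not synced to
# the worker's disk, the lookup fails exactly like in the question.
```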