Whenever the DAG is triggered with a custom JSON config, it should use those values in {{ params }}. I'd like to send this dictionary, with all its keys and sub-dicts, to a task that processes those values and checks whether they are correct.
I tried sending it untransformed, passing it through json.loads, and replacing the whitespace, but nothing seems to work: argparse can't recognize the argument, even though the same script works locally. Somehow Airflow doesn't respect the changes I try to make to this value.
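To illustrate what I mean by transforming: rendered as a plain dict, the value comes out Python-styled rather than as real JSON, which is what I suspect argparse/json.loads trips over (standalone sketch, the sample dict is made up):

import json

# Made-up example of what params could hold after a custom trigger.
params = {"source": {"bucket": "raw", "prefix": "2024/01"}, "dry_run": True}

print(str(params))         # {'source': {'bucket': 'raw', 'prefix': '2024/01'}, 'dry_run': True}
print(json.dumps(params))  # {"source": {"bucket": "raw", "prefix": "2024/01"}, "dry_run": true}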
How should I change the code in the DAG so that {{ params }} can be sent through argparse?
This is how the two scripts look, reduced to minimal code (imports omitted):
The Airflow DAG:
with DAG(
    "pipeline",
    render_template_as_native_obj=True,  # so that {{ params }} is a dict
) as dag:
    prevalidation = KubernetesPodOperator(
        task_id="pre-validation",
        name="pre-validation",
        cmds=["python3"],
        arguments=[
            "-m",
            "prevalidation",
            "--config",
            "{{ params }}",  # the params dict
        ],
    )
prevalidation.py:
if __name__ == "__main__":
    parser = ArgumentParser()
    parser.add_argument("--config")
    args, unknown = parser.parse_known_args()
    # won't get here when run from the DAG; locally it works
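
For comparison, this is roughly how I run it locally, where --config is picked up fine (standalone sketch; the invocation and payload are made up for illustration):

# python3 -m prevalidation --config '{"source": {"bucket": "raw"}, "dry_run": true}'
from argparse import ArgumentParser
import json

parser = ArgumentParser()
parser.add_argument("--config")
args, unknown = parser.parse_known_args()

config = json.loads(args.config)   # the quoted JSON string becomes a dict again
print(config["source"]["bucket"])  # -> raw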