
I want to generate the conn_id for spark_default. I am running Airflow on Kubernetes, and I want to build the conn_id on the fly so that it points at the Spark master, which is another container running in the same namespace.

Is there a way to generate the conn_id on the fly, for example:

  • via environment variables
  • or by having the SparkSubmitOperator itself create the conn_id
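For the environment-variable route, Airflow builds a connection from any variable named `AIRFLOW_CONN_<CONN_ID>` whose value is a connection URI, so the pod spec alone can define spark_default. A minimal sketch, where `spark-master` is an assumed Kubernetes service name, not something from this question:

```python
import os
from urllib.parse import urlsplit

# Airflow resolves AIRFLOW_CONN_SPARK_DEFAULT into the spark_default
# connection at runtime, so no metadata-DB entry is needed.
# "spark-master" is a hypothetical in-namespace service name.
os.environ["AIRFLOW_CONN_SPARK_DEFAULT"] = "spark://spark-master:7077"

# Sanity-check how the URI will be parsed into host/port.
parts = urlsplit(os.environ["AIRFLOW_CONN_SPARK_DEFAULT"])
print(parts.scheme, parts.hostname, parts.port)
```

In a Kubernetes deployment this variable would normally be set in the container's `env:` section rather than in Python.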

Here is my dag code:

from airflow import DAG

from airflow.contrib.operators.spark_submit_operator import SparkSubmitOperator
from datetime import datetime, timedelta


args = {
    'owner': 'airflow',
    'start_date': datetime(2019, 5, 22)
}
dag = DAG('spark_example_new', default_args=args, schedule_interval="*/10 * * * *")

operator = SparkSubmitOperator(
    task_id='spark_submit_job_from_airflow',
    conn_id='spark_default',
    java_class='org.apache.spark.examples.JavaWordCount',
    application='local:///opt/spark/examples/jars/spark-examples_2.12-2.4.1.jar',
    total_executor_cores='1',
    executor_cores='2',
    executor_memory='2g',
    num_executors='1',
    name='airflow-spark-example-coming-from-aws-k8s',
    verbose=True,
    driver_memory='1g',
    application_args=["/opt/spark/data/graphx/users.txt"],
    dag=dag,
)

1 Answer


You can try an approach like this one:

from airflow import settings
from airflow.models import Connection

def create_conn(username, password, host=None):
    # Build the Connection object; conn_type='spark' is what
    # SparkSubmitOperator expects for its conn_id.
    new_conn = Connection(conn_id=f'{username}_connection',
                          conn_type='spark',
                          login=username,
                          host=host)
    new_conn.set_password(password)

    # Persist the connection in the Airflow metadata database.
    session = settings.Session()
    session.add(new_conn)
    session.commit()
    session.close()
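If you would rather not touch the session API, the same connection can be created from the Airflow CLI, for example in an init container before the scheduler starts. This uses the Airflow 1.10.x flag style (matching the `airflow.contrib` import in the question); `spark-master` is an assumed service name:

```shell
airflow connections --add \
    --conn_id spark_default \
    --conn_type spark \
    --conn_host spark://spark-master \
    --conn_port 7077
```

Either way, the SparkSubmitOperator in the DAG can then keep referencing `conn_id='spark_default'` unchanged.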