I'm working on an Apache Spark application that I submit to an AWS EMR cluster from an Airflow task.
In the Spark application logic I need to read files from AWS S3 and data from AWS RDS. For example, in order to connect to AWS RDS for PostgreSQL from the Spark application, I need to provide the database username and password.
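For context, this is roughly how the job reads today; the credentials arrive as plain `spark-submit` arguments, which is exactly what I want to avoid (host, database, and table names below are placeholders):

```python
import sys

from pyspark.sql import SparkSession

# Credentials currently come in as plain command-line arguments.
db_user, db_password = sys.argv[1], sys.argv[2]

spark = SparkSession.builder.appName("rds-reader").getOrCreate()

# Read a table from RDS PostgreSQL over JDBC.
rds_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://my-rds-host:5432/my_db")
    .option("dbtable", "public.my_table")
    .option("user", db_user)
    .option("password", db_password)
    .option("driver", "org.postgresql.Driver")
    .load()
)

# S3 files are read via the cluster's EMR instance-profile role,
# so no extra credentials are needed for that part.
s3_df = spark.read.parquet("s3://my-bucket/some/prefix/")
```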
Right now I'm looking for a secure way to keep these credentials in a safe place and pass them as parameters to my Spark application. Where should I store them to keep the system secure: as environment variables, somewhere in Airflow, or elsewhere?
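One option I've considered is storing the username/password as an Airflow Connection and injecting them into the EMR step from the DAG. A minimal sketch of what I mean (the connection ID `rds_creds` and the S3 paths are hypothetical), though I'm unsure this is safe since the step arguments would be visible in the EMR console and logs:

```python
from airflow.hooks.base import BaseHook

# "rds_creds" would be an Airflow Connection created via the UI or CLI,
# stored in Airflow's metadata DB (or a configured secrets backend).
conn = BaseHook.get_connection("rds_creds")

spark_step = {
    "Name": "my-spark-step",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "spark-submit",
            "--deploy-mode", "cluster",
            "s3://my-bucket/jobs/my_job.py",
            # Passing secrets as step args exposes them in the EMR
            # console and logs -- part of why I'm asking.
            conn.login,
            conn.password,
        ],
    },
}
```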