
I have a Spark application running on AWS EMR. We have different environments on AWS, like prod, uat, dev, etc. I created an application.conf file with the required variables like the S3 bucket, IAM role, etc., but obviously these variables are different for each environment.

How can I pass a different conf file to spark-submit so that I don't have to change application.conf for each environment during deployments?
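For context, a common layout with Typesafe Config is one conf file per environment that includes the shared application.conf and overrides only what differs. A minimal sketch, assuming HOCON syntax; the key names and values below are placeholders:

# application.conf (shared defaults)
s3.bucket = "my-dev-bucket"
iam.role  = "arn:aws:iam::111111111111:role/dev-role"

# uat.conf (UAT overrides)
include "application.conf"
s3.bucket = "my-uat-bucket"
iam.role  = "arn:aws:iam::222222222222:role/uat-role"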

hlagvankar
    Possible duplicate of [specific config by environment in Scala](http://stackoverflow.com/questions/21607745/specific-config-by-environment-in-scala) – puhlen Apr 18 '17 at 13:17
  • I have application.conf and uat.conf. I want to pass uat.conf when running the Spark job in the UAT env, but I can't do it with the -Dconfig.resource option on the spark-submit command. – hlagvankar Apr 18 '17 at 23:00
  • @puhlen - Please note this is not a duplicate of the issue you mention. Yes, both talk about application.conf, but the command-line option `-D` does not work with spark-submit! – iyerland Jun 16 '17 at 08:19

1 Answer


Based on the answer given by @ozeebee in this post, the same approach can be used with spark-submit as well.

With spark-submit, you need to pass the property through spark.driver.extraJavaOptions, something like this:

spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dconfig.resource=devtest.conf" \
  --class thesyscat.query.ingest.eventenrichment.EventEnrichSparkApp \
  --master yarn \
  --deploy-mode client \
  <jar_location>
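This works because Typesafe Config's ConfigFactory.load() honors the config.resource (classpath resource) and config.file (filesystem path) system properties, so the driver picks up whichever conf file the -D option names. Below is a minimal sketch of the driver side, assuming Typesafe Config is on the classpath and the hypothetical keys s3.bucket and iam.role from the question:

import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.sql.SparkSession

object EventEnrichSparkApp {
  def main(args: Array[String]): Unit = {
    // Resolves config.resource / config.file if set via -D,
    // otherwise falls back to application.conf on the classpath.
    val conf: Config = ConfigFactory.load()

    // Hypothetical environment-specific keys
    val s3Bucket = conf.getString("s3.bucket")
    val iamRole  = conf.getString("iam.role")

    val spark = SparkSession.builder().appName("EventEnrich").getOrCreate()
    // ... use s3Bucket / iamRole in the job ...
    spark.stop()
  }
}

Note that config.resource expects the named file to be on the driver's classpath (for example packaged into the application jar). If the conf file lives outside the jar, one common approach is to ship it with --files devtest.conf and reference it with -Dconfig.file=devtest.conf instead.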
Rajat Mishra