I know this is a trivial question, but I could not find the answer on the internet.
I am trying to run a Java class with a main method that takes program arguments (String[] args). However, when I submit the job using spark-submit and pass the program arguments as I would with

java -cp <some jar>.jar <Some class name> <arg1> <arg2>

it does not read the args.
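For reference, the entry point looks roughly like this. This is a minimal sketch: the class name `ClassName` and what the arguments mean (`1234` plus two strings) are stand-ins for the real code, not the actual class.

```java
import java.util.Arrays;

// Illustrative stand-in for the class submitted via spark-submit.
// The real class name and argument semantics are assumptions.
public class ClassName {

    // Parses the first argument as a number and joins the rest,
    // mirroring an invocation like: ClassName 1234 someargument someArgument
    static String summarize(String[] args) {
        int first = Integer.parseInt(args[0]);
        return "first=" + first + " rest=" + String.join(",",
                Arrays.copyOfRange(args, 1, args.length));
    }

    public static void main(String[] args) {
        if (args.length < 3) {
            System.err.println("Usage: ClassName <number> <arg1> <arg2>");
            System.exit(1);
        }
        System.out.println(summarize(args));
    }
}
```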
The command I tried running was:
bin/spark-submit analytics-package.jar --class full.package.name.ClassName 1234 someargument someArgument
which gives:
Error: No main class set in JAR; please specify one with --class
and when I tried:
bin/spark-submit --class full.package.name.ClassName 1234 someargument someArgument analytics-package.jar
I get:
Warning: Local jar /mnt/disk1/spark/1 does not exist, skipping.
java.lang.ClassNotFoundException: com.relcy.analytics.query.QueryAnalytics
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:693)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
How can I pass these arguments? They change on every run of the job, so they cannot be hard-coded and must be passed as program arguments.