
I have a Spark job written in Scala. To run it, I use:

arguments="$@"
spark-shell -i <file-name-with-main>,<auxiliary-file-name1>,<auxiliary-file-name2> \
    --master yarn-client \
    --driver-memory 2G \
    --executor-memory 4G \
    --num-executors 10 \
    --executor-cores 4 \
    <(echo 'val args = "'"$arguments"'".split("\\s+")';
      echo "main(args)";
      echo ':q')
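
For reference, the process substitution at the end expands to a temporary file holding the commands the REPL consumes. Assuming the wrapper above is saved as, say, run.sh (a hypothetical name) and invoked as ./run.sh input.txt output, that file would contain:

    val args = "input.txt output".split("\\s+")  // rebuilds Array("input.txt", "output")
    main(args)                                   // explicitly invokes the loaded main
    :q                                           // tells the REPL to quit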

I got the idea from Passing command line arguments to Spark-shell. But I need to include echo "main(args)"; echo ':q' (or echo 'sys.exit') for it to work, as otherwise it hangs at the Scala REPL prompt.
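
For context, the file passed to -i would define main as a plain method. A minimal hypothetical sketch (the body here is made up, but sc is genuinely predefined by spark-shell):

    // Hypothetical contents of file-name-with-main. Loading it with -i
    // defines main but does not call it, hence the explicit main(args) above.
    def main(args: Array[String]): Unit = {
      val counts = sc.textFile(args(0))          // sc comes predefined in spark-shell
        .flatMap(_.split("\\s+"))
        .map(word => (word, 1))
        .reduceByKey(_ + _)
      counts.saveAsTextFile(args(1))             // write word counts to args(1)
    }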

Why does it hang without those extra lines? Is there a better way to do this?
