I need to execute a Scala script through spark-shell in silent mode. When I use spark-shell -i "file.scala", after execution I am dropped into the Scala interactive mode, which I don't want. I have tried spark-shell -i "file.scala", but I don't know how to run the script in silent mode.
spark-shell -i "file.scala"

After execution, I end up at the

scala>

prompt, which I don't want to enter.
Update (October 2019), for a script that terminates
This question is also about running a script that terminates, that is, a "Scala script" run by spark-shell -i script.scala > output.txt
that stops by itself (an internal System.exit(0)
instruction terminates the script).
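For concreteness, here is a minimal sketch of such a self-terminating script (the file name and its contents are hypothetical; inside spark-shell the `spark` SparkSession is predefined):

```scala
// script.scala — hypothetical minimal self-terminating script for spark-shell.
// `spark` (a SparkSession) is already defined inside spark-shell.
val n = spark.range(10).count()   // trivial job, just to produce some output
println(s"count = $n")
// Terminate explicitly so spark-shell does not fall into the scala> prompt:
System.exit(0)
```

It would be run as spark-shell -i script.scala > output.txt, and ideally only the println output would land in output.txt.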
See this question with a good example.
It also needs a "silent mode": it is expected not to pollute the output.txt.
Assume Spark v2.2+.
PS: there are many cases (typically small tools and module/algorithm tests) where the Spark interpreter can be better than the compiler... Please, "let's compile!" is not an answer here.