
I need to execute a Scala script through spark-shell in silent mode. I have tried:

spark-shell -i "file.scala"

but after the script executes I am dropped into the interactive prompt:

scala>

and I don't want to end up there.

Update (October 2019): for a script that terminates

This question is also about running a script that terminates, that is, a "Scala script" run by spark-shell -i script.scala > output.txt that stops by itself (an internal System.exit(0) instruction terminates the script).
See this question for a good example.

It also needs a "silent mode": the run is expected not to pollute output.txt.

Suppose Spark v2.2+.
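For concreteness, a minimal sketch of such a self-terminating script (the contents are illustrative; plain Scala stands in for real Spark work):

```scala
// script.scala — sketch of a self-terminating spark-shell script
// (the computation below is a placeholder for real Spark work)
val result = (1 to 10).sum      // = 55
println(s"result: $result")     // the only line wanted in output.txt
System.exit(0)                  // terminate instead of dropping into scala>
```

Run as spark-shell -i script.scala > output.txt.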


PS: there are a lot of cases (typically small tools and module/algorithm tests) where the Spark interpreter can be better than the compiler... Please, "let's compile!" is not an answer here.

Krzysztof Atłasik
Renganathan
    Please, cut down the repetition of content in your question, and use a spell checker in the future. – Dragonthoughts Aug 28 '19 at 07:24
    create a jar and use spark-submit instead. – undefined_variable Aug 28 '19 at 07:53
  • REPL is not meant for this. If your `file.scala` is pure scala code then you can compile it using `scalac` and run the class file using `java` or `scala` interpreter. If it's a spark application, you can use spark-submit instead. – Goldie Aug 28 '19 at 08:33

3 Answers


spark-shell -i file.scala keeps the interpreter open at the end, so System.exit(0) is required at the end of your script. The most appropriate solution is to place your code in a try {} block and put System.exit(0) in the finally {} section.
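A sketch of that shape (the body is illustrative):

```scala
// file.scala — try/finally pattern so the shell always exits
try {
  // job logic goes here; an exception still reaches the finally block
  val total = (1 to 5).map(_ * 2).sum   // = 30
  println(s"total: $total")
} finally {
  System.exit(0)   // leave spark-shell even if the body throws
}
```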

If logging is required, you can use something like this:

spark-shell < file.scala > test.log 2>&1 &

If you have limitations on editing the file and you can't add System.exit(0), use:

echo :quit | spark-shell -i file.scala
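This can be combined with redirection (a sketch: Spark's default log4j template sends log output to stderr, so discarding stderr keeps the log noise out of output.txt, though the REPL welcome banner still goes to stdout):

```shell
# run non-interactively and keep only stdout in output.txt
echo :quit | spark-shell -i file.scala 2>/dev/null > output.txt
```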

UPD

If you want to suppress everything in the output except printlns, you have to turn off logging for spark-shell. A sample of the configs is here. Disabling all logging in $SPARK_HOME/conf/log4j.properties should let you see only the printlns. But I would not follow this approach with printlns: general logging with log4j should be used instead. You can configure it to obtain the same results as with printlns; it boils down to configuring a pattern. This answer provides an example of a pattern that solves your issue.
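For reference, a quiet variant of conf/log4j.properties along the lines of Spark's bundled template might look like this (a sketch — the key change is raising rootCategory to ERROR, with the console appender on stderr so stdout stays clean for printlns):

```properties
# $SPARK_HOME/conf/log4j.properties — quiet spark-shell (sketch)
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# the default template also tunes the REPL's own logger
log4j.logger.org.apache.spark.repl.Main=ERROR
```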

  • Hi. I say that use `System.exit(0)`... And "quit" is not "quiet", the bounty is about "quiet output"... – Peter Krauss Oct 31 '19 at 14:10
  • @PeterKrauss what do you mean by "quiet"? –  Oct 31 '19 at 14:13
  • Hi, "to quit" by `:q` **is not "be quiet"**. The question is about "quiet mode" (verbose=none)... Please read the question and question's bounty, *"See UPDATING 2019 section: is expected to not pollute the `output.txt`"* – Peter Krauss Oct 31 '19 at 15:22
  • @PeterKrauss you point logs to `output.txt` and want to keep only specific output? –  Oct 31 '19 at 17:41
  • Yes, only `println()`, no "verbose" of unsolicited outputs. – Peter Krauss Oct 31 '19 at 17:43
  • Hi Artem, you deserve the bounty for your effort and good clues, but I haven't been able to test your update ... I will probably do tests next week, or as soon as possible, then we discuss again, and I will probably edit here to include a more objective answer. – Peter Krauss Nov 01 '19 at 09:43

The best way is definitely to compile your Scala code to a jar and use spark-submit, but if you're simply looking for a quick iteration loop, you can issue a :quit after your Scala code has been parsed:

echo :quit | spark-shell -i yourfile.scala
rluta
  • There is a Scala command to use inside `yourfile.scala`: it is `System.exit(0)`... But "to quit" or "to exit" **are not "be quiet"**. – Peter Krauss Oct 24 '19 at 15:29

Adding onto @rluta's answer: you can place the call to the spark-shell command inside a shell script, for example:

spark-shell < yourfile.scala

But this requires you to keep each statement on a single line, since the piped input is read by the REPL line by line.
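As a sketch, such a wrapper script (filenames illustrative) might be:

```shell
#!/bin/sh
# run_job.sh — wrap the spark-shell invocation, e.g. for cron/CI use
spark-shell < yourfile.scala > output.txt 2>&1
```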

OR

echo :quit | spark-shell -i yourfile.scala

This should exit spark-shell once the script has finished running.

apnith
  • Hi, "to quit" by `:q` **is not "be quiet"**. The question is about "quiet mode" (verbose=none)... Please read the question and question's bounty, *"See UPDATING 2019 section: is expected to not pollute the `output.txt`"* – Peter Krauss Oct 31 '19 at 15:22