
How can I execute a shell script in Jupyter and ignore the output it logs to standard output and standard error?

!sh my_shell_file.sh > /dev/null

does not work; plenty of log output still appears.
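Presumably this is because Spark's default log4j configuration typically sends console output to standard error, while > only redirects standard output. A minimal check of that assumption from a Jupyter cell:

!sh my_shell_file.sh 2> /dev/null

If that silences the messages, they are written to stderr rather than stdout.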

Background

I want to silence Spark's logs (see: How to stop INFO messages displaying on spark console?).

Not even:

spark.sparkContext.setLogLevel("FATAL")

or

spark.sparkContext.setLogLevel("ERROR")

worked when running:

spark-shell -i my_scala.scala

with a script containing:

spark.sparkContext.setLogLevel("ERROR")
spark.sparkContext.setLogLevel("FATAL")

// do some stuff
// generating plenty of logs

Edit

Manually setting:

import org.apache.log4j.Logger
import org.apache.log4j.Level
Logger.getRootLogger.setLevel(Level.ERROR)
Logger.getRootLogger.setLevel(Level.FATAL)

Logger.getLogger("org").setLevel(Level.WARN)

also fails to silence the logs.

20/11/05 15:18:16 WARN net.ScriptBasedMapping: Exception running /etc/hadoop/conf/topology_script.py 10.15.250.71 
ExitCodeException exitCode=1:   File "/etc/hadoop/conf/topology_script.py", line 63
    print rack
          ^
SyntaxError: Missing parentheses in call to 'print'. Did you mean print(rack)?

    at org.apache.hadoop.util.Shell.runCommand(Shell.java:1008)
    at org.apache.hadoop.util.Shell.run(Shell.java:901)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1213)

is still logged thousands of times.
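Note that this particular line is a WARN message, and a log4j threshold of Level.WARN still lets WARN messages through; only ERROR or FATAL would drop it. With the imports above, a sketch that raises the level just for the noisy logger (assuming the full class behind net.ScriptBasedMapping is org.apache.hadoop.net.ScriptBasedMapping):

// raise only the noisy Hadoop mapping logger so its WARN lines are dropped
Logger.getLogger("org.apache.hadoop.net.ScriptBasedMapping").setLevel(Level.ERROR)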


1 Answer


A simple &>/dev/null (see: How do I suppress output from this Shell command) will solve it, since it also discards Spark's standard error output.
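In a Jupyter cell this looks roughly like the following (a sketch):

!sh my_shell_file.sh &> /dev/null

If the shell Jupyter invokes does not understand bash's &>, the POSIX-portable equivalent behaves the same:

!sh my_shell_file.sh > /dev/null 2>&1

Either form discards both stdout and stderr, which is where the Spark log lines end up.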
