How can I turn off PySpark logging from a Python script? Please note: I do not want to make any changes in the Spark logger properties file.
1 Answer
To turn off (or modify) logging from a Python script:

from pyspark import SparkConf
from pyspark.sql import SparkSession

conf = SparkConf()
conf.set('spark.logConf', 'true')  # necessary in order to be able to change the log level
...  # other stuff and configuration

# create the session
spark = SparkSession.builder \
    .config(conf=conf) \
    .appName(app_name) \
    .getOrCreate()

# set the log level to one of ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
spark.sparkContext.setLogLevel("OFF")
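If you just want to see the effect, here is a minimal self-contained sketch (the app name and the spark.range job are only placeholders I made up to confirm that no INFO output appears; restoring WARN afterwards is optional):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("quiet-example").getOrCreate()

# override the log level at runtime, no change to the properties file needed
spark.sparkContext.setLogLevel("OFF")

# run a trivial job; with the level set to OFF no INFO/WARN lines should be printed
print(spark.range(100).count())

# restore warnings later if you still want to see real problems
spark.sparkContext.setLogLevel("WARN")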
Hope this helps, good luck!
Edit: For earlier versions, e.g. Spark 1.6, you can try something like the following, taken from here:
# `sc` is an existing SparkContext
logger = sc._jvm.org.apache.log4j
logger.LogManager.getLogger("org").setLevel(logger.Level.OFF)
# or silence everything via the root logger
logger.LogManager.getRootLogger().setLevel(logger.Level.OFF)
I haven't tested it, unfortunately; please let me know if it works.
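In case it helps, here is how the whole thing might look on 1.6 as a self-contained sketch. I haven't run this against a 1.6 cluster either, and the app name and master are placeholders I picked for the example:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("quiet-1.6-example").setMaster("local[*]")
sc = SparkContext(conf=conf)

# reach the JVM's log4j classes through py4j and silence both the "org" logger and the root logger
log4j = sc._jvm.org.apache.log4j
log4j.LogManager.getLogger("org").setLevel(log4j.Level.OFF)
log4j.LogManager.getRootLogger().setLevel(log4j.Level.OFF)

# sc.setLogLevel (added around Spark 1.4) should also work and likewise needs no properties-file change
sc.setLogLevel("OFF")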

mkaran
I am working on Spark 1.6, which doesn't support SparkSession :( Could you please suggest some other solution? – eiram_mahera Nov 10 '17 at 09:42
@eiram_mahera edited the answer, let me know if it works for you :) – mkaran Nov 10 '17 at 10:06