
I've edited the log4j.properties file to set log4j.rootCategory=INFO, console, and the logging has stopped in spark-shell, but it is unrelenting in pyspark.
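For context, the edit in conf/log4j.properties looks roughly like the fragment below. The specific logger names are assumptions taken from a typical Spark log4j.properties.template, so adjust them to your install:

```properties
# Root logger: only INFO and above go to the console appender
log4j.rootCategory=INFO, console

# Quieten specific noisy namespaces (names assumed from the
# stock Spark template; adjust to taste)
log4j.logger.org.apache.spark=WARN
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.akka=WARN
```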

The code below works for me in pyspark, but I need to run it every time I open the pyspark console.

logger = sc._jvm.org.apache.log4j
logger.LogManager.getLogger("org").setLevel(logger.Level.OFF)
logger.LogManager.getLogger("akka").setLevel(logger.Level.OFF)
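One way to avoid retyping the three lines is to wrap them in a small helper and keep it in a file you exec at the start of each session. This is only a sketch; quiet_logs is a name I made up, and it assumes sc is the live SparkContext:

```python
def quiet_logs(sc):
    """Silence the noisy "org" and "akka" loggers via the JVM gateway.

    `sc` is an active SparkContext; `sc._jvm` exposes the driver JVM,
    so this does the same thing as typing the three lines by hand.
    """
    log4j = sc._jvm.org.apache.log4j
    log4j.LogManager.getLogger("org").setLevel(log4j.Level.OFF)
    log4j.LogManager.getLogger("akka").setLevel(log4j.Level.OFF)
```

In the shell you could then run exec(open("quiet_logs.py").read()) followed by quiet_logs(sc). It is still per-session, but a single call instead of three lines.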

I'm looking for a permanent fix for this issue and want to understand why pyspark alone picks up the default properties when spark-shell does not. I've checked this thread but couldn't find a solution.

Vinay

1 Answer


This worked for me:

sc.setLogLevel("OFF")

That is, you set the log level to OFF through the SparkContext object itself.

eiram_mahera