I've edited the log4j.properties file to set log4j.rootCategory=INFO, console, and the logging has stopped in spark-shell, but it continues unabated in pyspark.
The code below works for me in pyspark, but I have to run it every time I open the pyspark console.
# access the JVM-side log4j classes through the py4j gateway
logger = sc._jvm.org.apache.log4j
logger.LogManager.getLogger("org").setLevel(logger.Level.OFF)
logger.LogManager.getLogger("akka").setLevel(logger.Level.OFF)
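As a stopgap until there's a proper fix, one way to avoid retyping those lines is to wrap them in a helper function (the name quiet_logs is my own, and it assumes an active SparkContext sc):

```python
def quiet_logs(sc):
    """Silence the JVM-side org and akka log4j loggers via the py4j gateway."""
    log4j = sc._jvm.org.apache.log4j
    log4j.LogManager.getLogger("org").setLevel(log4j.Level.OFF)
    log4j.LogManager.getLogger("akka").setLevel(log4j.Level.OFF)
```

Putting this in a module and importing it (or loading it through a PYTHONSTARTUP script) at least shortens the per-session ritual, though it still isn't the permanent configuration-level fix I'm after.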
I'm looking for a permanent fix for this issue, and I want to understand why pyspark alone picks up the default properties when spark-shell does not. I've checked this thread but couldn't find a solution.