I am using the 'pyspark' package to initialize a Spark context, and when I do, a lot of log output is displayed. Is there any option or command available to avoid this?

Code snippet:

from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext

conf = SparkConf().setAppName("test").setMaster("local")
sc = SparkContext(conf=conf)

(Screenshot: verbose INFO log output printed when the SparkContext starts.)

Thanks in advance :)

Ramkumar
  • Thanks for your reply @Stiffo. I have modified log4j.properties, but this stops the log output in the Spark Scala shell too. Are there any other options available? – Ramkumar Jul 31 '15 at 09:27

1 Answer

Answer courtesy of user AkhlD:

Edit your conf/log4j.properties file and change the following line:

    log4j.rootCategory=INFO, console

to

    log4j.rootCategory=ERROR, console
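
If conf/log4j.properties does not exist yet, Spark ships a conf/log4j.properties.template that can be copied to that name and then edited. After the change, the console section of the file looks roughly like this (a sketch; exact contents vary by Spark version):

    # Set everything to be logged to the console
    log4j.rootCategory=ERROR, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n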

Another approach: fire up spark-shell and type in the following:

import org.apache.log4j.Logger
import org.apache.log4j.Level

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)

You won't see any logs after that.
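
Since the question uses PySpark rather than the Scala shell, a rough Python equivalent is sketched below. SparkContext.setLogLevel is a public API from Spark 1.4 onward; the second variant goes through sc._jvm, which is an internal py4j handle rather than a supported API. Either way, the change applies only to the current session, so the Scala shell's logging is unaffected:

    from pyspark import SparkContext, SparkConf

    conf = SparkConf().setAppName("test").setMaster("local")
    sc = SparkContext(conf=conf)

    # Spark 1.4+: raise the log threshold for this context only.
    # Messages printed while the context starts up will still appear.
    sc.setLogLevel("ERROR")

    # Alternative that mirrors the Scala snippet above, via the py4j
    # gateway (sc._jvm is an internal attribute, not a public API):
    log4j = sc._jvm.org.apache.log4j
    log4j.LogManager.getLogger("org").setLevel(log4j.Level.OFF)
    log4j.LogManager.getLogger("akka").setLevel(log4j.Level.OFF)

Note that messages emitted while the context is starting up are printed before either call runs, so suppressing those still requires the log4j.properties route.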

Stiffo