
I am running a Scala Spark application that is submitted via spark-submit:

spark-submit --class "com.foo.bar.MyClass" \
         --master yarn \
         --driver-memory 1g \
         --executor-memory 1g \
         --num-executors 2 \
         --executor-cores 2 \
         --jars <path-to>/MyJar.jar \
                <path-to>/MyJar.jar

I have tried just about every configuration of log4j I can think of, or have found here, here, here and here, among others. I have put this line into my code:

Logger.getRootLogger().setLevel(Level.WARN)

among other lines that try to suppress just individual classes.
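
The per-class variants were roughly like this (a sketch only; the package names are the usual Spark, Hadoop, and Akka prefixes, adjust them to whatever is noisy in your output):

import org.apache.log4j.{Level, Logger}

// Quiet the root logger plus the noisiest packages individually
Logger.getRootLogger.setLevel(Level.WARN)
Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
Logger.getLogger("org.apache.hadoop").setLevel(Level.WARN)
Logger.getLogger("akka").setLevel(Level.ERROR)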

I have also put a line of output in my main method to show what level the root logger is set to:

println("Log level = " + LogManager.getRootLogger.getLevel())

It will show whatever I change the values to. If I change the setLevel line of code above, it always prints out that value; if I take the line of code out, it prints whatever I have in the SPARK_HOME/conf/log4j.properties file; and if I add the --conf "spark.driver.extraJavaOptions" lines, it shows what I put in there.

Bottom line: I can change any of those settings, and the application prints out whatever log level I set, but regardless I still get MBs of logging from Spark.

Any other suggestions on how to disable all the verbose logging from Spark?

ksdaly

1 Answer

  1. Create a log4j-configurations.properties file from log4j.properties.template.
  2. Add it under the conf or configurations folder of your project.
  3. Add the lines below to your spark-submit command:

--conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=prop/file/location' \
--conf 'spark.driver.extraJavaOptions=-Dlog4j.configuration=prop/file/location' \
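
For example (a sketch only; the file name and <path-to> placeholders are mine, and the --files option is what ships the properties file into the executors' working directory on YARN so a relative -Dlog4j.configuration reference resolves there):

# log4j-configurations.properties -- minimal example
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# quiet the chattiest package further if needed
log4j.logger.org.apache.spark=WARN

spark-submit --class "com.foo.bar.MyClass" \
         --master yarn \
         --files <path-to>/log4j-configurations.properties \
         --conf 'spark.driver.extraJavaOptions=-Dlog4j.configuration=file:<path-to>/log4j-configurations.properties' \
         --conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-configurations.properties' \
         --jars <path-to>/MyJar.jar \
                <path-to>/MyJar.jar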
shreya