14

I am trying to suppress the message

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

when I run my Spark app. I've redirected the INFO messages successfully; however, this message keeps showing up. Any ideas would be greatly appreciated.

Seagull
  • 2,219
  • 6
  • 25
  • 33
  • 1
  Add your Spark conf folder to the CLASSPATH variable. – Kaushal Jun 24 '15 at 09:59
  • @Kaushal you're correct. I had to do something similar, though not exactly what you suggested here, but I'm sure that if I had done what you suggested, this message would have gone away. Thanks! – Seagull Jun 24 '15 at 21:50

5 Answers

15

Even simpler: just `cd SPARK_HOME/conf`, then `mv log4j.properties.template log4j.properties`, then open log4j.properties and change all INFO to ERROR. Here SPARK_HOME is the root directory of your Spark installation.
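
For reference, the shell steps look like this (a sketch; the template file name matches a standard Spark distribution, and the `sed` one-liner is just one way to make the edit):

    cd "$SPARK_HOME"/conf
    mv log4j.properties.template log4j.properties
    # Change every INFO log level to ERROR (GNU sed; on macOS use `sed -i ''`)
    sed -i 's/INFO/ERROR/g' log4j.properties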

Some may be using HDFS as their Spark storage backend and will find that the logging messages are actually generated by HDFS. To change this, go to the HADOOP_HOME/etc/hadoop/log4j.properties file and simply change `hadoop.root.logger=INFO,console` to `hadoop.root.logger=ERROR,console`. Once again, HADOOP_HOME is the root of your Hadoop installation; for me this was /usr/local/hadoop.
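
After the edit, the relevant line in HADOOP_HOME/etc/hadoop/log4j.properties reads (assuming the stock Hadoop configuration):

    # Default log level and appender for Hadoop's root logger
    hadoop.root.logger=ERROR,console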

quine
  • 982
  • 1
  • 12
  • 20
9

Okay, so I've figured out a way to do this. Basically, I initially had my own log4j.xml that was being used, and hence we were seeing this message. Once I had my own log4j.properties file, this message went away.

Seagull
  • 2,219
  • 6
  • 25
  • 33
  • 2
    This is the exact solution if you're using IntelliJ. Great job. The other answers did not work for me. – Jason Wolosonovich Dec 27 '16 at 02:34
  • Can you please give some details on how you got rid of `Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties`? I'm using Spark 1.6, running the job on a YARN cluster, and I'm unable to figure out how to pass log4j in a distributed environment. I have my own log4j properties file. – senthil kumar p May 08 '17 at 06:39
  • 4
    Independent of local/distributed, a log4j.properties needs to exist on the classpath, and you need to define a root logger there, like `log4j.rootLogger=INFO, console`. See https://github.com/apache/spark/blob/5264164a67df498b73facae207eda12ee133be7d/core/src/main/scala/org/apache/spark/internal/Logging.scala for reference! – oae Aug 24 '18 at 07:58
7

This also occurs if you put a log4j.properties file under both main/resources and test/resources. In that case, deleting the file from test/resources and using only the file from main/resources fixes the issue.
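
In a typical sbt or Maven layout, that means (illustrative paths):

    src/main/resources/log4j.properties   <- keep this one
    src/test/resources/log4j.properties   <- delete this one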

Danny Varod
  • 17,324
  • 5
  • 69
  • 111
  • You can just copy the contents of this file to your `src/main/resources/log4j.properties` file. https://github.com/apache/spark/blob/master/core/src/main/resources/org/apache/spark/log4j-defaults.properties – ekrich Feb 23 '21 at 00:35
2

None of the answers above worked for me using SBT. It turns out you need to explicitly define an appender in your log4j.properties, such as:

# Console appender writing to stdout with a timestamped pattern
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %-5p %c{1}:%L - %m%n

# Quiet everything (including Spark) to WARN, but keep your own packages at INFO
log4j.rootLogger=WARN, stdout
log4j.logger.org.apache.spark=WARN, stdout
log4j.logger.com.yourcompany=INFO, stdout

Put this in your resources directory and Bob's your uncle!

386-DX
  • 21
  • 2
1

If you are using Spark 3.4, just rename your log4j.properties to log4j2.properties to make use of your logging configurations. Any changes I made to log4j.properties had no effect on logging.
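
Spark at this version uses Log4j 2, whose properties syntax differs from Log4j 1.x. A minimal log4j2.properties sketch (the appender name `console` and the pattern are arbitrary choices, not anything Spark requires):

    # Console appender writing to stdout
    appender.console.type = Console
    appender.console.name = console
    appender.console.target = SYSTEM_OUT
    appender.console.layout.type = PatternLayout
    appender.console.layout.pattern = %d{HH:mm:ss} %p %c: %m%n

    # Root logger at ERROR, attached to the console appender
    rootLogger.level = error
    rootLogger.appenderRef.stdout.ref = console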

smishra
  • 3,122
  • 29
  • 31