
I am trying to build an application that uses log4j to write log files. The location of the log file is provided through the log4j.properties file.

So far it works well, but I want to separate the Spark logs from the logs that I generate in my own code, or at least have only my log messages written to the log file.

Is there any way to do that?

log4j.properties

# Root logger option
log4j.rootLogger=INFO, stdout, file
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

# Redirect log messages to a log file
log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.file.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.file.rollingPolicy.fileNamePattern=../log/abc%d{yyyyMMdd_HHmmss}.log
log4j.appender.file.TriggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.file.TriggeringPolicy.maxFileSize=5000000
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
Amber
  • To just hide Spark logging you can set the logger for org.apache.spark to a high level in your log4j config, e.g. ERROR (see the sketch after these comments). Or you could define a different log appender which outputs to a different location and assign that to org.apache.spark – A Spoty Spot Oct 24 '16 at 12:12
  • @ASpotySpot I have added the config file for log4j in the question. Could you suggest how I can define a different log file for the spark logs? – Amber Oct 24 '16 at 12:31
  • Are you submitting the Spark job on a YARN cluster? If yes, can you please provide the command you used for spark-submit? – senthil kumar p May 10 '17 at 13:28
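
For reference, a minimal sketch of the first suggestion (just hiding Spark's output), assuming the log4j.properties from the question is otherwise unchanged:

# Raise the level for Spark's own loggers so only errors pass through to stdout and file
log4j.logger.org.apache.spark=ERROR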

2 Answers


You can easily define different appenders for the packages you want to log differently, each writing to its own destination.

Example of log4j.properties:

# Set root logger level to INFO and attach the A1 and sparkappender appenders.
log4j.rootLogger=INFO, A1, sparkappender

# A1 is set to be a ConsoleAppender.
log4j.appender.A1=org.apache.log4j.ConsoleAppender

# A1 uses PatternLayout.
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%-4r [%t]%-5p %c %x - %m%n

# org.apache.spark package will log TRACE logs
log4j.logger.org.apache.spark=TRACE, sparkappender
log4j.logger.org.spark_project.jetty=ERROR, sparkappender
log4j.additivity.org.apache.spark=false

log4j.appender.sparkappender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.sparkappender.datePattern='-'dd'.log'
log4j.appender.sparkappender.File=log/spark-logs.log
log4j.appender.sparkappender.layout=org.apache.log4j.PatternLayout
log4j.appender.sparkappender.layout.ConversionPattern=%-4r [%t]%-5p %c %x - %m%n

Quick explanation of the file above:

With the sparkappender appender, log4j writes all matching logs into the file log/spark-logs.log. Logs that are not produced by classes within the org.apache.spark or org.spark_project.jetty packages go to the console. These two packages have different log levels but share the same log4j appender - sparkappender.

In your example you can keep your existing file appender for your own logs and assign sparkappender (with a log level) to the Spark packages via log4j.logger entries, as in the example above; a sketch follows.
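
As a rough sketch of that combination, reusing the stdout and file appenders from the question and the sparkappender from this answer, and assuming your own classes live under a package such as com.mycompany.app (a placeholder, not from the question):

# Root logger keeps only the console appender
log4j.rootLogger=INFO, stdout

# Your own code keeps writing to the existing 'file' appender
# (com.mycompany.app is a placeholder for your real package)
log4j.logger.com.mycompany.app=INFO, file
log4j.additivity.com.mycompany.app=false

# Spark and its Jetty dependency go to sparkappender only
log4j.logger.org.apache.spark=INFO, sparkappender
log4j.additivity.org.apache.spark=false
log4j.logger.org.spark_project.jetty=ERROR, sparkappender
log4j.additivity.org.spark_project.jetty=false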

VladoDemcak

Unfortunately I'm not sure how to do it using the properties file; I have only configured log4j using XML.

However in XML you could add the following to a 'standard' config:

<appender name="spark-file" class="org.apache.log4j.FileAppender">
    <param name="File" value="spark.log" />
    <param name="Append" value="true" />
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n" />
    </layout>
</appender>

This defines an appender that will output logs to a file called spark.log. Then to use it:

<logger name="org.apache.spark" additivity="false">
    <level value="INFO" />
    <appender-ref ref="spark-file" />
</logger>

I expect that looking into the docs on how to convert this to the properties format 'shouldn't' be too difficult; a rough sketch is below.
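
An untested properties-format equivalent of the XML above (the spark-file appender name, file name and INFO level are carried over from the XML):

# spark-file appender, mirroring the <appender> element above
log4j.appender.spark-file=org.apache.log4j.FileAppender
log4j.appender.spark-file.File=spark.log
log4j.appender.spark-file.Append=true
log4j.appender.spark-file.layout=org.apache.log4j.PatternLayout
log4j.appender.spark-file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

# Route org.apache.spark to it without bubbling up to the root logger (additivity=false)
log4j.logger.org.apache.spark=INFO, spark-file
log4j.additivity.org.apache.spark=false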

A Spoty Spot
  • Converting this to .properties format wouldn't be a problem. But wouldn't this just set the output file for the logs, as opposed to setting separate output files for the Spark logs and my custom logs in the code? – Amber Oct 24 '16 at 13:04
  • You also have your default appender, which still appends to the console. Here, though, you use an appender-ref rather than falling back to the root logger, so the org.apache.spark package uses the spark-file appender. I think that is roughly how it works. You may get duplicate logs, as your Spark logs may also appear in your root logs; in that case http://stackoverflow.com/questions/13627235/log4j-multiple-loggers-levels-and-appenders may help. – A Spoty Spot Oct 24 '16 at 13:30