
I'm trying to get my Spark application to log to its own log file; I don't want my output mixed in with Spark's, it's unreadable.

I gave up on Logback because of compatibility issues with Spark's libraries, so I turned to log4j. I created a custom log4j.properties in src/main/resources of my Java application, but when I launched spark-submit with my jar, all my logs were written to Spark's worker log file. It seems the custom log4j.properties inside my jar was ignored.

This is the command:

./spark-submit --jars /home/user/LIBRERIE/ORACLE/ojdbc8.jar,\
/home/user/.m3/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.3.0/mongo-spark-connector_2.11-2.3.0.jar,\
/home/user/.m3/repository/org/mongodb/mongo-java-driver/3.8.1/mongo-java-driver-3.8.1.jar,\
/home/user/.m3/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar \
--class my.pkg.common.SparkHandlerStandalone \
--master  spark://162.16.215.59:7077 \
--deploy-mode cluster \
/home/user/NetBeansProjects/SparkScala/target/SparkScala-1.0-SNAPSHOT.jar

My log4j.properties:

log4j.rootLogger=DEBUG, file

log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/home/user/TEMP/Spark/sparkapp.log
log4j.appender.file.MaxFileSize=5MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p %c{1}:%L - %m%n

Does anyone know how I can separate the two logs?

Stefania
  • Did you check answers in this post- https://stackoverflow.com/questions/27781187/how-to-stop-info-messages-displaying-on-spark-console/43747948#43747948? – Rahul Sharma Sep 20 '18 at 14:59
  • Yes, but it's not what I need. I want to separate my logs from the Spark ones; that post talks about the log level of Spark logging. I'm wondering if I have to mix my stuff into the content of the Spark log4j configuration file, I really hope not. – Stefania Sep 20 '18 at 15:24
  • You can create a separate logging appender and then assign a logger to that appender. You can specify the logger package, log level etc. – Rahul Sharma Sep 21 '18 at 06:00
  • ... we had the jar hell (a child of the old dll hell). And our years are the years of the logback hell. Every six or seven modules you add as dependencies in your POM, you have to remove this component because it causes trouble... – Marc Le Bihan Sep 21 '18 at 11:42

1 Answer


Create a separate category for your custom logging in the log4j properties file:

log4j.appender.customLog=org.apache.log4j.FileAppender
log4j.appender.customLog.File=/home/user/TEMP/Spark/sparkapp.log
log4j.appender.customLog.layout=org.apache.log4j.PatternLayout
log4j.appender.customLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p %c{1}:%L - %m%n

log4j.category.customLog=INFO, customLog
log4j.additivity.customLog=false

In the application code, look up the logger by its category name and use it:

static final Logger customLog = Logger.getLogger("customLog");
customLog.info("Test msg");

Make sure the custom log4j.properties file is passed to both the driver and executor JVMs via the extra Java options. Note that log4j resolves log4j.configuration as a URL, so the file: prefix is needed for an absolute path:

spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/home/hadoop/spark-conf/log4j.properties

spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/home/hadoop/spark-conf/log4j.properties
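Putting it together, a spark-submit invocation along these lines should work. This is a sketch: the properties path and the --files trick (which copies the file into each container's working directory, so a relative file:log4j.properties resolves there) are illustrative, not taken from the original post:

```shell
# Ship the custom log4j.properties to the driver and executors with --files,
# then point log4j at the copy in each JVM's working directory.
# /home/user/spark-conf/log4j.properties is an assumed local path.
./spark-submit \
  --class my.pkg.common.SparkHandlerStandalone \
  --master spark://162.16.215.59:7077 \
  --deploy-mode cluster \
  --files /home/user/spark-conf/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  /home/user/NetBeansProjects/SparkScala/target/SparkScala-1.0-SNAPSHOT.jar
```

Alternatively, keep the absolute file:/home/hadoop/spark-conf/log4j.properties paths from the answer if the file exists at the same location on every worker node.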
Ravikumar