I am submitting jobs to a cluster that is shared among many people, and I would like to change some logging configuration. Since the cluster is shared, changing the log level on the cluster itself is obviously a non-option, so I am attempting to submit my log4j.properties file via spark-submit.
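For context, the file I am shipping is essentially the stock Spark log4j template with the levels adjusted. Simplified, it is something like the following (the application package name on the last line is just an illustrative placeholder):

```
# Root logger: send WARN and above to the console (stderr).
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# DEBUG for my own application classes only (placeholder package name).
log4j.logger.com.example.myapp=DEBUG
```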

Per the documentation: "upload a custom log4j.properties using spark-submit, by adding it to the --files list of files to be uploaded with the application."
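Concretely, the submit command is along these lines (the master URL, main class, and jar are placeholders standing in for my actual application):

```
# Placeholders: master URL, main class, and jar stand in for the real app.
spark-submit \
  --master spark://master-host:7077 \
  --files log4j.properties \
  --class com.example.MyApp \
  my-app.jar
```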

I did that; however, the workers are still using the default log4j properties, as shown by the log line: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties. However, I do see that the log4j.properties file is properly copied over to the worker machines at /data/spark/work-dir/<app-id>/<worker-id>/log4j.properties. How do I make the Spark workers actually use this log4j.properties file?

For reference:

  • Spark 1.5.1
  • Default deploy mode (client)
  • I also tried setting the path via `spark.executor.extraJavaOptions=-Dlog4j.configuration=...`, but no luck there (see the sketch after this list)
  • Logging is fine on the driver; it appears to be just the workers that I am having trouble with.
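For the extraJavaOptions attempt above, what I tried was roughly the following. Since --files drops the file into each executor's working directory, my understanding is that a relative file: URL should resolve there (again, everything other than the Spark property names is a placeholder):

```
# Attempt: point each executor's log4j at the file shipped via --files.
# The file should land in the executor's working directory, so a relative
# file: URL is what I expected to work (it did not, in my case).
spark-submit \
  --master spark://master-host:7077 \
  --files log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  my-app.jar
```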

Similar post: "How to override Spark's log4j.properties per driver?". However, I have no log4j.properties file under conf/, which appears to be the solution there.
