
I know Hadoop uses Log4j for logging, and I was able to view the properties in conf/log4j.properties to figure out where the existing log files are. However, is there a way for me to direct the logs from an HBase MapReduce job to a completely new file? The idea is that I have a job scheduled to run nightly, and I'd like to be able to log to /var/log/myjob.log for just this job so that I can check that file for any errors/exceptions instead of having to go through the JobTracker UI. Is this possible? If so, how? Also, note that the job will be submitted to the cluster, so please advise whether the log file needs to be on HDFS or on the regular (Linux) filesystem. If on Linux, should it be on all nodes or just the Hadoop master?

Thanks for any suggestions.

– chapstick

1 Answer


You can dynamically create a Log4j FileAppender, as explained in Configuring Log4j Loggers Programmatically, in the setup() method of your mapper and reducer classes.
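For example, here is a minimal sketch of that approach, assuming Log4j 1.x (the version Hadoop bundles); the class name MyJobMapper, the pattern layout, and the /var/log/myjob.log path are illustrative, not part of any Hadoop API:

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.log4j.FileAppender;
    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;
    import org.apache.log4j.PatternLayout;

    public class MyJobMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

        private static final Logger LOG = Logger.getLogger(MyJobMapper.class);

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
            // Attach a FileAppender at runtime so this task's log lines go to
            // a job-specific file on the local disk of whichever node runs the task.
            FileAppender appender = new FileAppender();
            appender.setName("myjob");
            appender.setFile("/var/log/myjob.log");  // local path on the task node
            appender.setLayout(new PatternLayout("%d %-5p [%c{1}] %m%n"));
            appender.setThreshold(Level.INFO);
            appender.setAppend(true);
            appender.activateOptions();              // must be called after the setters
            Logger.getRootLogger().addAppender(appender);
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            LOG.info("Processing record at offset " + key.get());
            // ... actual map logic ...
        }
    }

Note that a FileAppender writes to the local filesystem of whichever node the task runs on, not to HDFS. So the target directory must exist and be writable by the task's user on every node that can run tasks, and you will get one such file per node rather than a single consolidated log.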

– Magham Ravi
  • Ravi, thanks for the suggestion, but I am not sure I can do that from the MapReduce job since it runs in a distributed fashion. – chapstick Jun 25 '13 at 12:32