I'm submitting my jobs to a Spark cluster (running on YARN) programmatically from a Java app using SparkLauncher (starting the job with startApplication(), not launch()). I would like to have all the log output that is written to stdout and stderr while the launcher executes the job in a file that I can access from the Java app. I don't want to change the global Spark log configuration; I want a dynamic solution that I can control from the Java app, based on variables that change on every single execution.
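For context, this is roughly how I launch the job (the jar path, main class and deploy mode are simplified placeholders, not my real values):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class JobLauncher {
    public static void main(String[] args) throws Exception {
        // Start the job via startApplication() and keep the handle to monitor it.
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/my-spark-job.jar")   // placeholder jar path
                .setMainClass("com.example.MySparkJob")        // placeholder main class
                .setMaster("yarn")
                .setDeployMode("cluster")
                .startApplication();

        // Wait until the application reaches a final state (FINISHED, FAILED, KILLED).
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}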
According to the documentation, this should be possible using the CHILD_PROCESS_LOGGER_NAME option. So I defined a java.util.logging.Logger as shown here and added this line to my job launcher:
SparkLauncher.setConfig(SparkLauncher.CHILD_PROCESS_LOGGER_NAME, "MyLog");
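The logger itself is defined roughly like this (the log file path and formatter are placeholders I've simplified):

import java.io.IOException;
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class JobLogSetup {
    // Creates the java.util.logging logger whose name matches the value passed to
    // CHILD_PROCESS_LOGGER_NAME above, with a FileHandler so its output lands in a file.
    static Logger createJobLogger() throws IOException {
        Logger log = Logger.getLogger("MyLog");
        FileHandler fileHandler = new FileHandler("/tmp/spark-job.log");  // placeholder log file
        fileHandler.setFormatter(new SimpleFormatter());
        log.addHandler(fileHandler);
        return log;
    }
}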
But this doesn't work: the log file stays empty. I also tried other methods such as setConf(...) and addSparkArg(...), without success. What did I do wrong? Or would it be better to use log4j with a custom configuration and pass it to the launcher somehow? If so, how would I do that from my Java app?
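For completeness, these are roughly the setConf / addSparkArg variants I tried (again with placeholder values), which also left the file empty:

import org.apache.spark.launcher.SparkLauncher;

public class LoggerNameAttempts {
    static void configure(SparkLauncher launcher) {
        // Attempt 1: set the logger name as a conf on the launcher instance.
        launcher.setConf(SparkLauncher.CHILD_PROCESS_LOGGER_NAME, "MyLog");

        // Attempt 2: pass it through as an extra spark-submit argument.
        launcher.addSparkArg("--conf", SparkLauncher.CHILD_PROCESS_LOGGER_NAME + "=MyLog");
    }
}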