My application ships quite a few jar files, which are copied to the work directory of each executor. These directories live under $SPARK_HOME/work and hold the program's libraries and logs (stdout and stderr). Note that I am not talking about Spark's tmp directories here; those are something else.
Since these directories can get quite big, I want to remove them as soon as my program is done. One obvious way is to write a cleanup script myself (something like the sketch below), but is there a way to tell Spark to do it for me, i.e., delete these directories as soon as the application finishes?
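For context, this is roughly the kind of script I mean. It is only a minimal sketch in Python, assuming the per-application directories sit directly under $SPARK_HOME/work and that it would be run on each worker node after the application has finished:

```python
import os
import shutil

# Minimal manual-cleanup sketch: remove every application directory
# under $SPARK_HOME/work. The path layout and the choice to wipe
# everything unconditionally are assumptions for illustration only.
spark_home = os.environ["SPARK_HOME"]
work_dir = os.path.join(spark_home, "work")

for entry in os.listdir(work_dir):
    app_dir = os.path.join(work_dir, entry)
    if os.path.isdir(app_dir):
        shutil.rmtree(app_dir)  # deletes the copied jars plus stdout/stderr for that app
```

I would rather not maintain and schedule something like this on every worker if Spark already has a built-in option for it.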