Whenever I run dse spark-submit <jarname>, the jar is copied into SPARK_WORKER_DIR (in my case /var/lib/spark-worker/worker-0). I want the jar to be deleted automatically once the Spark job completes successfully. To that end, I changed SPARK_WORKER_OPTS in spark-env.sh as follows:
export SPARK_WORKER_OPTS="$SPARK_WORKER_OPTS -Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=1800"
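For reference, here is a sketch of the full set of worker-cleanup properties as I understand them from the Spark standalone docs. The spark.worker.cleanup.appDataTtl property (which I did not set) defaults to 604800 seconds, i.e. 7 days, and my understanding is that cleanup only removes directories of stopped applications older than this TTL, so maybe the jar is simply not old enough yet:

# Sketch with all three cleanup properties; interval and TTL values
# shown are Spark's documented defaults, not values I have verified work.
# - spark.worker.cleanup.enabled: turns the periodic cleaner on (off by default)
# - spark.worker.cleanup.interval: seconds between cleanup sweeps (default 1800)
# - spark.worker.cleanup.appDataTtl: how long to keep app dirs (default 604800 = 7 days)
export SPARK_WORKER_OPTS="$SPARK_WORKER_OPTS \
  -Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=1800 \
  -Dspark.worker.cleanup.appDataTtl=604800"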
But the jar still isn't being deleted. Am I doing something wrong? What should I do?