
When I restart the Spark cluster, all the history of completed applications in the web UI is deleted. How can I keep this history from being deleted when I restart?

DAVID_ROA

2 Answers


Spark doesn't keep event logs by default, so the history is lost on restart. If you want to store it, you need to enable event logging with the spark.eventLog settings, for example:

./bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://10.129.6.11:7077 \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir="hdfs://your path" \
  /home/spark/spark-3.2.1-bin-hadoop3.2/examples/jars/spark-examples_2.12-3.2.1.jar 8
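The same settings can also go in conf/spark-defaults.conf so every submission logs events without extra flags, and the Spark History Server can then serve the saved history even after the cluster is restarted. A minimal sketch, assuming a hypothetical HDFS directory hdfs://namenode:8020/spark-events (substitute your own path):

# conf/spark-defaults.conf -- write event logs for every application
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://namenode:8020/spark-events
# point the History Server at the same directory
spark.history.fs.logDirectory    hdfs://namenode:8020/spark-events

# start the History Server; its UI listens on port 18080 by default
./sbin/start-history-server.sh

Completed applications should then show up at http://<history-server-host>:18080, independent of Spark master restarts.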
Utkarsh I.

Don't restart the Spark master. Keep it running and submit queries to it, the way tools like Zeppelin do.

Juhong Jung