When I restart the Spark cluster, the history of completed applications in the web UI is deleted. How can I keep this history from being lost across restarts?
2 Answers
Spark itself doesn't persist application history by default. If you want to keep it, enable event logging with spark.eventLog.enabled and point spark.eventLog.dir at a durable directory:
./bin/spark-submit --class org.apache.spark.examples.SparkPi \
--master spark://10.129.6.11:7077 \
--conf spark.eventLog.enabled=true \
--conf spark.eventLog.dir="hdfs://your path" \
/home/spark/spark-3.2.1-bin-hadoop3.2/examples/jars/spark-examples_2.12-3.2.1.jar 8
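
You can also set this once for all applications in conf/spark-defaults.conf instead of passing --conf flags on every submit. A minimal sketch, assuming an HDFS directory hdfs://namenode:8020/spark-events that you have created beforehand (the directory must already exist; substitute your own path):

# conf/spark-defaults.conf -- picked up by every spark-submit
spark.eventLog.enabled    true
spark.eventLog.dir        hdfs://namenode:8020/spark-events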

– Utkarsh I.
Don't restart the Spark master at all; keep it running and send queries to it from a client such as Zeppelin.

– Juhong Jung
- Suppose the server crashes or is turned off... what should I do then? – DAVID_ROA Oct 22 '18 at 06:38
- @DAVID_ROA Usually the Spark history log is stored on HDFS. Can you find it there? – Juhong Jung Oct 22 '18 at 07:56
- No, I think I need to enable a property or set a config, but I don't know which one. – DAVID_ROA Oct 22 '18 at 08:15
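
Writing event logs is only half of it: to browse completed applications after a restart you also run the Spark history server, which reads the same directory via spark.history.fs.logDirectory (its UI listens on port 18080 by default). A sketch, assuming the hdfs://namenode:8020/spark-events path used above:

# conf/spark-defaults.conf -- tell the history server where the logs live
spark.history.fs.logDirectory    hdfs://namenode:8020/spark-events

# start the history server (ships with the standard Spark distribution)
./sbin/start-history-server.sh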