I am running jobs on Databricks clusters. While a cluster is running, I can find the executor logs by going to the Spark Cluster UI's Master dropdown, selecting a worker, and looking through its stderr logs. However, once the job finishes and the cluster terminates, I can no longer see those logs. Instead I get the screen below:
I am unable to access the Spark UI (last tab). Is there any way to retrieve the executor logs after the cluster is terminated, just like we can download the driver logs?