
I am running jobs on Databricks clusters. While the cluster is running, I can find the executor logs by going to the Spark Cluster UI Master dropdown, selecting a worker, and going through its stderr logs. However, once the job finishes and the cluster terminates, I am unable to see those logs. I get the screen below:

[Screenshot: Databricks cluster after completion]

I am unable to access the Spark UI (the last tab). Is there any way to get the executor logs after the cluster is terminated, just like we can download the driver logs?

Tusharjain93

2 Answers


Hope this will help you; more details here:

  1. Click on Jobs

  2. Click the job you want to see logs for

  3. Click "Logs". This will show you driver logs.

For executor logs, the process is a bit more involved:

  1. Click on Clusters

  2. Choose the cluster in the list corresponding to the job

  3. Click Spark UI

  4. Now choose the worker whose logs you want to see. Click the nodes list (it's on the far right, next to "Apps"), and then click stdout or stderr to see the logs.
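If the cluster has already terminated, step 2 can be awkward because job clusters drop out of the list quickly. As a rough sketch (assuming the Jobs REST API `runs/get` endpoint, with placeholder workspace URL, token, and run ID), you can look up which cluster a given run executed on:

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                       # placeholder token
RUN_ID = 12345                                          # placeholder run id

# Jobs API: fetch the run details; the response includes the cluster the run used.
resp = requests.get(
    f"{HOST}/api/2.0/jobs/runs/get",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": RUN_ID},
)
resp.raise_for_status()
run = resp.json()

# cluster_instance.cluster_id identifies the (possibly terminated) cluster;
# it is also the id that appears in the log delivery path in the answer below.
print(run["cluster_instance"]["cluster_id"])
```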

dsk
  • I saw that link. This works while the job is running, and it only shows a snapshot of the logs. I want to download the entire executor logs after the job is finished, just like the driver logs are available. – Tusharjain93 Aug 17 '20 at 12:20
  • @Tusharjain93 did you ever manage to do that? Maybe even programmatically? I am struggling with the same problem... – Thomas Sep 27 '21 at 12:04
  • No Thomas, there is no way we can extract the worker node logs. – Tusharjain93 Sep 28 '21 at 13:05

You can configure your cluster's log delivery location.
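For reference, this is roughly what the log delivery setting looks like when submitting a job run against a new cluster. A minimal sketch assuming the Jobs REST API `runs/submit` endpoint, with placeholder values for the workspace URL, token, Spark version, node type, and notebook path; the same destination can also be set in the cluster UI under Advanced Options, in the Logging tab.

```python
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                       # placeholder token

# Submit a one-off run on a new cluster with log delivery enabled. Driver and
# executor logs are copied to the destination periodically and stay there
# after the cluster terminates.
payload = {
    "run_name": "example-run-with-log-delivery",
    "new_cluster": {
        "spark_version": "7.3.x-scala2.12",  # placeholder runtime version
        "node_type_id": "i3.xlarge",         # placeholder node type
        "num_workers": 2,
        "cluster_log_conf": {
            "dbfs": {"destination": "dbfs:/cluster-logs"}  # log delivery location
        },
    },
    "notebook_task": {"notebook_path": "/Users/someone@example.com/my_notebook"},
}

resp = requests.post(
    f"{HOST}/api/2.0/jobs/runs/submit",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
)
resp.raise_for_status()
print(resp.json()["run_id"])
```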

After that, you can find the executor logs under {log_delivery_location}/{cluster_id}/executor/.

You can find the cluster_id in the URL of the Spark UI. To read the log files, you can download them by copying them into dbfs:/FileStore/ and using this answer.
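As a rough illustration, from a notebook you could list and copy the delivered executor logs like this. It is a sketch assuming a dbfs:/cluster-logs delivery location and a placeholder cluster ID; dbutils is available in Databricks notebooks without an import.

```python
# Assumes log delivery was configured to dbfs:/cluster-logs and that the
# cluster id was taken from the Spark UI URL (placeholder value below).
log_root = "dbfs:/cluster-logs"
cluster_id = "0817-123456-abc123"  # placeholder cluster id

executor_dir = f"{log_root}/{cluster_id}/executor"

# List what was delivered under the executor folder (there may be
# per-application subfolders containing stdout/stderr files).
for f in dbutils.fs.ls(executor_dir):
    print(f.path, f.size)

# Copy everything to /FileStore so it can be downloaded through the browser
# (https://<workspace-url>/files/...), as described in the linked answer.
dbutils.fs.cp(executor_dir, f"dbfs:/FileStore/executor-logs/{cluster_id}", recurse=True)
```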

axreldable