
We could use some help on how to send Spark driver and worker logs to a destination outside Azure Databricks, e.g. to Azure Blob Storage or to Elasticsearch using Elastic Beats.

When configuring a new cluster, the only option for the log delivery destination is DBFS, see

https://docs.azuredatabricks.net/user-guide/clusters/log-delivery.html.

Any input much appreciated, thanks!

Jean Vache

1 Answer


Maybe the following could be helpful:

First, you specify a DBFS location for your Spark driver and worker logs:
https://docs.databricks.com/user-guide/clusters/log-delivery.html
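If you set the cluster up through the Clusters API rather than the UI, the destination goes in the `cluster_log_conf` field of the cluster spec. A minimal sketch; the workspace URL, token, cluster ID, and node settings below are placeholders you would replace with your own:

```python
import requests

# Sketch: configure cluster log delivery to a DBFS path via the Clusters API.
# All values in angle brackets are placeholders for this example.
resp = requests.post(
    "https://<your-workspace>.azuredatabricks.net/api/2.0/clusters/edit",
    headers={"Authorization": "Bearer <personal-access-token>"},
    json={
        "cluster_id": "<cluster-id>",
        "spark_version": "7.3.x-scala2.12",
        "node_type_id": "Standard_DS3_v2",
        "num_workers": 2,
        # Driver and worker logs are delivered periodically to this path.
        "cluster_log_conf": {
            "dbfs": {"destination": "dbfs:/mnt/databricks-cluster-logs"}
        },
    },
)
resp.raise_for_status()
```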

Then, you create a mount point that links your DBFS folder to a Blob Storage container.
https://docs.databricks.com/spark/latest/data-sources/azure/azure-storage.html#mount-azure-blob-storage-containers-with-dbfs
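The mount itself can be created once from a notebook with `dbutils.fs.mount`. A sketch, where the storage account, container, and secret scope/key names are made up for illustration:

```python
# Sketch: mount an Azure Blob Storage container at /mnt/databricks-cluster-logs.
# The storage account, container, and secret scope/key names are placeholders.
dbutils.fs.mount(
    source="wasbs://cluster-logs@mystorageaccount.blob.core.windows.net",
    mount_point="/mnt/databricks-cluster-logs",
    extra_configs={
        "fs.azure.account.key.mystorageaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-account-key")
    },
)
```

With the mount in place, point the cluster's log delivery at `dbfs:/mnt/databricks-cluster-logs` and the log files land directly in the Blob container.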

Hope this helps!

Joseph M. Dion
  • Thank you! Got it working when I specified the path with `/mnt`, e.g. `dbfs:/mnt/databricks-cluster-logs`. Specifying just `dbfs:/databricks-cluster-logs` didn't work. – Shrikant Prabhu Oct 29 '21 at 22:36