I am using a Databricks Spark cluster and want to add a customized Spark configuration. There is Databricks documentation on this, but I am not getting any clue about what changes I should make or how. Can someone please share an example of how to configure a Databricks cluster?
Is there any way to see the default Spark configuration for a Databricks cluster?
Stark
I have yet to see any documentation of the Databricks-specific config options. Hopefully someone can chime in with that documentation. – Foxhound013 Jan 25 '23 at 15:16
2 Answers
You have several ways to set up the cluster's Spark configuration:
Manually in the Compute tab (as mentioned before): go to Compute > select a cluster > Advanced Options > Spark, and enter the properties in the Spark config box.
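In that Spark config box, Databricks expects one property per line, with the key and value separated by a space; the values below are only illustrative:

```
spark.executor.memory 4g
spark.sql.shuffle.partitions 200
```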
Via notebook (as mentioned before): in a cell of your Databricks notebook, you can set Spark configuration values for that session/job by calling spark.conf.set, for example

spark.conf.set("spark.executor.memory", "4g")

(Note that only settings which Spark allows to be changed at runtime actually take effect this way; executor sizing is normally fixed when the cluster starts, so it is better set on the cluster itself.)
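Touching on the second part of the question (seeing what configuration the cluster is currently running with), a minimal sketch that works in a Databricks notebook, where spark is already defined:

```python
# List every Spark property the current cluster/session is using,
# including Databricks defaults and anything set in the cluster config.
for key, value in sorted(spark.sparkContext.getConf().getAll()):
    print(key, "=", value)

# Or read a single setting (raises if the key is not set and has no default):
print(spark.conf.get("spark.executor.memory"))
```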
Using the Jobs API or Databricks CLI: if you are aiming to deploy jobs programmatically across environments (e.g. Dev, Staging, Production), you can include a spark_conf block in the job's cluster spec, as sketched below.
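A minimal sketch of that approach, assuming the Jobs API 2.1 jobs/create endpoint; the workspace URL, token, notebook path, runtime version, and node type are placeholders you would replace with your own:

```python
import requests

# Placeholders -- substitute your own workspace URL and personal access token.
HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

job_spec = {
    "name": "example-job-with-spark-conf",
    "tasks": [
        {
            "task_key": "main",
            "notebook_task": {"notebook_path": "/Users/someone@example.com/my_notebook"},
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",  # illustrative runtime version
                "node_type_id": "i3.xlarge",          # illustrative node type
                "num_workers": 2,
                # Custom Spark configuration applied when the job cluster starts.
                "spark_conf": {
                    "spark.executor.memory": "4g",
                    "spark.sql.shuffle.partitions": "200",
                },
            },
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json())  # contains the new job_id
```

The same JSON payload can also be used with the Databricks CLI's jobs commands if you prefer not to call the REST API directly.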
Useful links!

Leonardo Pedroso