I want to register a custom SparkListener with Databricks' Spark context.
With plain Spark I can just set the "spark.jars" and "spark.extraListeners" configs during spark-submit, or use the sparkContext.addSparkListener API.
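For example, on plain Spark I can register it from PySpark through the py4j gateway, something like this (com.example.MySparkListener is a placeholder for my actual listener class, which is already on the driver classpath):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# Instantiate the JVM-side listener via py4j; com.example.MySparkListener
# is a placeholder for the real listener class name.
listener = spark.sparkContext._jvm.com.example.MySparkListener()
# _jsc is PySpark's internal JavaSparkContext handle; .sc() returns the
# underlying Scala SparkContext, which exposes addSparkListener.
spark.sparkContext._jsc.sc().addSparkListener(listener)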
For my Databricks setup, I have installed the jar containing the listener on my cluster. But when I put the "spark.extraListeners" config in the cluster's "Advanced options" Spark config tab, the cluster fails to initialize with a "Listener not found" error.
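The Spark config entry looked roughly like this (again with a placeholder for my listener's fully qualified class name):

spark.extraListeners com.example.MySparkListener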
I tried setting it when building the SparkSession:

from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .appName("abc") \
    .config("spark.extraListeners", "mySparkListener") \
    .enableHiveSupport() \
    .getOrCreate()
Databricks won't add it: no error is thrown, but the listener is never registered.
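I can at least verify what actually took effect (a quick sanity check; my assumption is that the session already exists when the notebook starts, so builder .config() calls are silently ignored):

# Check whether the builder config was applied to the live session;
# spark.conf.get accepts a default for keys that were never set.
print(spark.conf.get("spark.extraListeners", "not set"))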
Is there any way to do this? Note: I am using Python notebooks on Databricks.