I have been using Azure Databricks (runtime LTS 7.3, Spark 3.0, PySpark) with the com.microsoft.azure.kusto:kusto-spark_3.0_2.12:2.9.1 connector for quite some time now, but recently my jobs have started failing with the error below. The failures are random: sometimes the jobs run fine, other times they simply fail.
df = pyKusto.read \
    .format("com.microsoft.kusto.spark.datasource") \
    .option("kustoCluster", kustoOptions["kustoCluster"]) \
    .option("kustoDatabase", kustoOptions["kustoDatabase"]) \
    .option("kustoQuery", Query) \
    .option("kustoAadAppId", kustoOptions["kustoAadAppId"]) \
    .option("kustoAadAppSecret", kustoOptions["kustoAadAppSecret"]) \
    .option("kustoAadAuthorityID", kustoOptions["kustoAadAuthorityID"]) \
    .load()
java.lang.ClassNotFoundException: Failed to find data source: com.microsoft.kusto.spark.datasource. Please find packages at http://spark.apache.org/third-party-projects.html
I have already installed the library on the cluster, and the job ran without issues for quite some time, so I'm not sure what changed recently. Has anyone seen this issue, and can you suggest a workaround?
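One thing I have been checking on my side: this ClassNotFoundException can appear when the connector jar is missing from the classpath or when its Spark/Scala build does not match the cluster runtime. As a sketch (the helper name and the assumed coordinate format `group:artifact:version` with `kusto-spark_<spark>_<scala>` are my own, not from any library), this is the kind of sanity check I run before submitting:

```python
def check_connector_coordinate(coordinate, cluster_spark="3.0", cluster_scala="2.12"):
    """Return True if the Maven coordinate's Spark/Scala versions match the cluster.

    Assumes the coordinate looks like:
        com.microsoft.azure.kusto:kusto-spark_3.0_2.12:2.9.1
    i.e. group:artifact:version, where the artifact name encodes the
    Spark and Scala versions as kusto-spark_<spark>_<scala>.
    """
    group, artifact, version = coordinate.split(":")
    # Split the artifact from the right so the base name may contain underscores.
    _base, spark_ver, scala_ver = artifact.rsplit("_", 2)
    return spark_ver == cluster_spark and scala_ver == cluster_scala

# A Spark 2.4 / Scala 2.11 build on a Spark 3.0 / Scala 2.12 cluster would fail this check.
print(check_connector_coordinate("com.microsoft.azure.kusto:kusto-spark_3.0_2.12:2.9.1"))
```

In my case the coordinate matches the runtime, so the mismatch check passes, which is why I suspect the intermittent failures come from the library not being attached to the cluster in time rather than a wrong build.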
Thanks