I have an Azure Databricks notebook that contains a SQL command. I need to capture the output of that SQL command and use it in .NET Core. Need help.
1 Answer
You cannot capture the results of an Azure Databricks notebook directly in .NET Core. There is also no .NET SDK available, so you need to rely on the Databricks REST APIs from your .NET code for all of these operations. You could try the following:
- Update your notebook to export the result of your SQL query as a CSV file to the file store using df.write. For example:
df.write.format("csv").option("header", "true").save("/FileStore/sqlResults.csv")
- Set up a job with the above notebook, then invoke the job from .NET using the Jobs API run-now endpoint.
- Poll the run state using the Jobs API runs get endpoint (run-now returns a run_id) to detect job completion from your .NET code.
- Once the job has completed, use the DBFS API read endpoint to read the contents of the CSV file your notebook generated in step 1.
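Putting the steps above together, here is a rough .NET Core sketch using HttpClient. The workspace URL, personal access token, job ID, and output path are placeholders you would replace with your own values. Also note that Spark's save() actually writes a directory of part files at the given path, so in practice you may need to call the DBFS list endpoint first to find the exact part file name, or coalesce the DataFrame to a single file in the notebook.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class DatabricksJobRunner
{
    // Placeholder workspace URL and token; substitute your own.
    static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://<your-workspace>.azuredatabricks.net")
    };

    static async Task Main()
    {
        Http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<personal-access-token>");

        // 1. Trigger the job that runs the notebook (Jobs API: run-now).
        //    123 is a placeholder job ID.
        var runNowBody = new StringContent(
            JsonSerializer.Serialize(new { job_id = 123 }),
            Encoding.UTF8, "application/json");
        var runNowResp = await Http.PostAsync("/api/2.0/jobs/run-now", runNowBody);
        var runId = JsonDocument.Parse(await runNowResp.Content.ReadAsStringAsync())
            .RootElement.GetProperty("run_id").GetInt64();

        // 2. Poll the run status until it leaves the PENDING/RUNNING states.
        string lifeCycleState;
        do
        {
            await Task.Delay(TimeSpan.FromSeconds(10));
            var statusResp = await Http.GetAsync($"/api/2.0/jobs/runs/get?run_id={runId}");
            lifeCycleState = JsonDocument.Parse(await statusResp.Content.ReadAsStringAsync())
                .RootElement.GetProperty("state")
                .GetProperty("life_cycle_state").GetString();
        } while (lifeCycleState == "PENDING" || lifeCycleState == "RUNNING");

        // 3. Read the CSV the notebook wrote to DBFS (DBFS API: read).
        //    The response carries the file content base64-encoded in "data".
        //    The path below assumes a single file; with Spark part files you
        //    would read the part-0000... file inside the output directory.
        var readResp = await Http.GetAsync(
            "/api/2.0/dbfs/read?path=/FileStore/sqlResults.csv&offset=0&length=1048576");
        var base64Data = JsonDocument.Parse(await readResp.Content.ReadAsStringAsync())
            .RootElement.GetProperty("data").GetString();
        var csv = Encoding.UTF8.GetString(Convert.FromBase64String(base64Data));
        Console.WriteLine(csv);
    }
}
```

The DBFS read endpoint caps each response at 1 MB of data, so for larger result files you would loop, advancing the offset parameter until the returned bytes_read is zero.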

SaurabhSharma