
I'm trying to install the com.crealytics.spark.excel package in Databricks. Is there a command-line way to install it, without going through Cluster > Libraries > Install New in the UI?

– ganpat

1 Answer


You can use the Databricks CLI to install 'com.crealytics.spark.excel' in Databricks.

Syntax: databricks libraries install --cluster-id "<cluster-id>" --maven-coordinates "GroupId:ArtifactId:Version" (e.g. org.jsoup:jsoup:1.7.2)
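
This assumes the (legacy, pip-installable) Databricks CLI is already set up and authenticated against your workspace; if not, a minimal setup sketch looks like this:

# Install the legacy Databricks CLI (requires Python and pip)
pip install databricks-cli

# Configure the CLI; it will prompt for your workspace URL and a personal access token
databricks configure --token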

Step 1: From the Maven Repository, pick the spark-excel version you are looking for and note its Maven coordinates (GroupId:ArtifactId:Version).


Step 2: Use the Databricks CLI command below to install 'com.crealytics.spark.excel' on the cluster.

databricks libraries install --cluster-id "0925-XXXXXX-bite618" --maven-coordinates "com.crealytics:spark-excel_2.12:0.13.5"
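
You can then check the installation status from the CLI as well. This is a sketch assuming the legacy Databricks CLI's libraries subcommands, reusing the same placeholder cluster ID as above:

# Show library statuses for the cluster; the new library should move from INSTALLING to INSTALLED
databricks libraries cluster-status --cluster-id "0925-XXXXXX-bite618"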


For other methods of installing packages in Azure Databricks, refer to: How to install a library on a databricks cluster using some command in the notebook?
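
If you later need to remove the library, the CLI has a matching uninstall subcommand. A sketch under the same assumptions as above (note that, per Databricks behavior, the uninstall only takes effect after the cluster is restarted):

# Mark the Maven library for removal from the cluster
databricks libraries uninstall --cluster-id "0925-XXXXXX-bite618" --maven-coordinates "com.crealytics:spark-excel_2.12:0.13.5"

# Restart the cluster so the uninstall takes effect
databricks clusters restart --cluster-id "0925-XXXXXX-bite618"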

– CHEEKATLAPRADEEP