
I export the contents of my Databricks workspace directory (/Users/xyz/), which include several Python notebooks and scripts, to a DBFS location, e.g. /dbfs/tmp, and then try to run a Python notebook named xyz.py from the exported location as follows:

dbutils.notebook.run("/dbfs/tmp/xyz", timeout_seconds=1200)

OR

dbutils.notebook.run("dbfs:/tmp/xyz", timeout_seconds=1200)

OR

dbutils.notebook.run("../../tmp/xyz", timeout_seconds=1200)

dbutils is never able to find the notebook at any of these paths and throws the following exception:

com.databricks.WorkflowException: com.databricks.NotebookExecutionException: Unknown state: Notebook not found: /dbfs:/tmp/xyz

However, if I check that same DBFS path directly, I can see that the notebook is there.

How can I make dbutils.notebook.run execute a notebook from a specific DBFS location in Databricks?

1 Answer


This is expected behavior with dbutils.notebook.run. When you specify a location other than a workspace path such as "/Users/abc@org.com/notebookname", it throws an error of the form com.databricks.WorkflowException: com.databricks.NotebookExecutionException: Unknown state: Notebook not found: /tmp/mount.dbc.

I would suggest placing all the notebooks under /Users, and then you can call them using dbutils.notebook.run("/Users/abc@org.com/NotebookName").
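If the files must live on DBFS first (e.g. after a git clone), one workaround is to import them back into the workspace via the Workspace REST API (POST /api/2.0/workspace/import) and then run them by workspace path. A minimal sketch of building the import request body; the host, token, and paths shown in the comments are placeholders, not values from the question:

```python
import base64

def build_import_payload(source_code: str, workspace_path: str) -> dict:
    """Build the JSON body for POST /api/2.0/workspace/import, which
    imports raw Python source into the workspace as a notebook."""
    return {
        "path": workspace_path,    # e.g. "/Users/abc@org.com/xyz"
        "format": "SOURCE",        # import the raw .py source
        "language": "PYTHON",
        "overwrite": True,
        # content must be base64-encoded per the Workspace API
        "content": base64.b64encode(source_code.encode("utf-8")).decode("ascii"),
    }

# Inside Databricks you could read the exported file via the /dbfs fuse
# mount, import it, and then run it by its workspace path (sketch only):
#   payload = build_import_payload(open("/dbfs/tmp/xyz.py").read(),
#                                  "/Users/abc@org.com/xyz")
#   requests.post(f"{host}/api/2.0/workspace/import",
#                 headers={"Authorization": f"Bearer {token}"}, json=payload)
#   dbutils.notebook.run("/Users/abc@org.com/xyz", timeout_seconds=1200)
```

After the import succeeds, the notebook resolves like any other workspace notebook, so dbutils.notebook.run accepts it.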


CHEEKATLAPRADEEP
Hi @CHEEKATLAPRADEEP-MSFT, thank you. Does that mean that dbutils.notebook.run cannot execute any Python script/notebook from a DBFS location, only from your user workspace? I ask because I have a requirement to clone git repos to a DBFS location and then execute notebooks from the cloned location. – Pankaj Verma Jun 09 '20 at 11:58