
I have a Python notebook A in Azure Databricks with an import statement like the one below:

import xyz, datetime, ...

I have another notebook xyz that is imported into notebook A, as shown in the code above. When I run notebook A, it throws the following error:

ImportError: No module named xyz  

Both notebooks are in the same workspace directory. Can anyone help in resolving this?

user39602

2 Answers


The only way to import another notebook is with the %run command:

%run /Shared/MyNotebook

or with a relative path:

%run ./MyNotebook

More details: https://docs.azuredatabricks.net/user-guide/notebooks/notebook-workflows.html
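
For example, here is a minimal sketch of what notebook A could look like, assuming notebook xyz (in the same workspace directory) defines a hypothetical function get_timestamp. Note that %run must be the only command in its cell:

%run ./xyz

Then, in a following cell, everything defined in notebook xyz is in scope, so you call it directly instead of using import xyz:

import datetime

# get_timestamp is assumed to be defined in notebook xyz
print(get_timestamp(), datetime.datetime.now())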

simon_dmorias
    Thanks Simon, is this still the best way to import modules/functions from other notebooks into Databricks? – Umar.H Nov 21 '19 at 11:21

To get the result back as a DataFrame from a different notebook in Databricks, you can do the following.

notebook1

def func1(arg):
    # apply your transformation logic to the input DataFrame
    df = arg.dropna()  # stand-in for your actual transformations
    return df

notebook2

%run path-of-notebook1

df = func1(dfinput)

Here dfinput is the DataFrame you pass in, and you will get the transformed DataFrame back from func1.
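
For example, a minimal sketch of notebook2, assuming notebook1 is saved at /Shared/notebook1 (a placeholder path) and func1 is defined as above:

%run /Shared/notebook1

# Build a sample input DataFrame using the spark session Databricks provides
dfinput = spark.createDataFrame([(1, "a"), (2, None)], ["id", "value"])

# Pass it to the function defined in notebook1 and get the transformed DataFrame back
df = func1(dfinput)
df.show()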

Kashyap