
How do I pass a Python variable from a %python cell to a %sh shell script in an Azure Databricks notebook?

notebook example

Peter Pan
Tony

1 Answer


In my experience, there are two workarounds for passing a Python variable to a Bash script in your scenario.

Here is my sample code, using Python 3 in a notebook.

  1. Pass small amounts of data via an environment variable, in the same shell session of the Azure Databricks notebook:

    %python
    import os
    l = ['A', 'B', 'C', 'D']
    os.environ['LIST'] = ' '.join(l)
    print(os.getenv('LIST'))
    
    %%bash
    for i in $LIST
    do
      echo $i
    done
    

It works as shown in the screenshot (omitted here): the %%bash cell prints A, B, C and D on separate lines.
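Outside a Databricks notebook, the same environment-variable handoff can be sketched with `subprocess` (a hypothetical standalone test, not the notebook mechanism itself):

```python
import os
import subprocess

# Build the child's environment: same idea as os.environ['LIST'] in the
# notebook cell, but scoped to one explicit bash invocation.
l = ['A', 'B', 'C', 'D']
env = dict(os.environ, LIST=' '.join(l))

# bash reads LIST from its environment, exactly like the %%bash cell does.
result = subprocess.run(
    ['bash', '-c', 'for i in $LIST; do echo "$i"; done'],
    capture_output=True, text=True, env=env, check=True,
)
print(result.stdout, end='')
```

This makes explicit that the variable crosses the Python/Bash boundary as an environment entry, which is why both cells must share the same session in the notebook.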

  2. Pass larger data via a file in the current working directory of the Azure Databricks notebook:

    %python
    # 'l' is the list defined in the first cell of this session
    with open('varL.txt', 'w') as f:
      for elem in l:
        f.write(elem + '\n')
    
    %%bash
    pwd
    ls -lh varL.txt
    echo '=======show content=========='
    cat varL.txt
    echo '=====result of script========'
    for i in $(cat varL.txt)
    do
      echo $i
    done
    

It works as shown in the screenshot (omitted here): the %%bash cell lists the file, shows its contents, and then prints each line of varL.txt.
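One caveat with the `for i in $(cat varL.txt)` loop above: it splits on every whitespace character, so an element that itself contains a space would break into pieces. A whitespace-safe sketch (illustrative file contents, not from the answer) uses `while read` instead:

```shell
# Sample file: note the first element contains a space.
printf '%s\n' 'A B' 'C' 'D' > varL.txt

# IFS= and -r keep each line intact, including embedded spaces.
while IFS= read -r line; do
  echo "elem: $line"
done < varL.txt
```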

  • Thanks @Peter for your explanation! – Tony Feb 13 '19 at 08:50
  • How would one do this if you didn't want to hardcode the name of the environment variable? In cmd 2 in the first way, you hardcode the name LIST in $LIST. Is there anyway that the name for LIST could be stored in a python variable and evaluated within the bash script? Like `x = 'LIST'` and you do something like `for i in $x` in the bash script. In my experience, this doesn't work. – dzubke Nov 10 '22 at 17:17
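Regarding the last comment: Bash does support indirect expansion with `${!name}`, which expands to the value of the variable whose name is stored in `name`. A sketch (not verified in Databricks; assumes the Python cell exports both `LIST` and the name-holding variable `x` via `os.environ`):

```shell
# Stand-ins for what the Python cell would export:
#   os.environ['LIST'] = 'A B C D'
#   os.environ['x'] = 'LIST'
export LIST='A B C D'
export x='LIST'

# ${!x} first resolves x -> 'LIST', then expands $LIST.
for i in ${!x}; do
  echo "$i"
done
```

Note this requires bash (sh/dash do not support `${!name}`), which may explain why it appeared not to work.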