I am trying to find a way to execute a SQL expression that is stored in a column, using PySpark. Here is a sample:
| col1 | col2 | rules |
|---|---|---|
| rule1 | test | when col1 = 'rule1' then 'hello' when col2 = 'yyy' then 'world' |
| rule2 | test1 | when col2 = 'abc' then 'foo' when col1 is null then 'bar' |
After evaluating the rules column, I want the output to look like this:
| col1 | col2 | rules | result |
|---|---|---|---|
| rule1 | test | when col1 = 'rule1' then 'hello' when col2 = 'yyy' then 'world' | hello |
| rule2 | test1 | when col2 = 'abc' then 'foo' when col1 is null then 'bar' | null |
I tried `selectExpr`, but it isn't working. Any help would be appreciated.