I wanted to evaluate two conditions in when, like this:
import pyspark.sql.functions as F
df = df.withColumn(
    'trueVal',
    F.when(df.value < 1 OR df.value2 == 'false', 0).otherwise(df.value)
)
For this I get 'invalid syntax' for using 'OR'.
I even tried nested when statements:
df = df.withColumn(
    'v',
    F.when(df.value < 1, F.when(df.value =1, 0).otherwise(df.value)).otherwise(df.value)
)
For this I get 'keyword can't be an expression' for the nested when statements.
How can I use multiple conditions in when? Is there any workaround?