While defining multiple logical/relational conditions on a Spark DataFrame in Scala, I get the error mentioned below. The same thing works fine in Python.
Python code:
from pyspark.sql.functions import col, abs

df2=df1.where(((col('a')==col('b')) & (abs(col('c')) <= 1))
| ((col('a')==col('fin')) & ((col('b') <= 3) & (col('c') > 1)) & (col('d') <= 500))
| ((col('a')==col('b')) & ((col('c') <= 15) & (col('c') > 3)) & (col('d') <= 200))
| ((col('a')==col('b')) & ((col('c') <= 30) & (col('c') > 15)) & (col('c') <= 100)))
Attempted Scala equivalent:
val df_aqua_xentry_dtb_match=df_aqua_xentry.where((col("a") eq col("b")) & (abs(col("c") ) <= 1))
notebook:2: error: type mismatch;
found : org.apache.spark.sql.Column
required: Boolean
val df_aqua_xentry_dtb_match=df_aqua_xentry.where((col("a") eq col("b")) & (abs(col("c") ) <= 1))
How do I define multiple logical conditions on a Spark DataFrame in Scala?
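From the error, the problem appears to be that eq is plain JVM reference equality inherited from AnyRef, so col("a") eq col("b") yields a Scala Boolean, and Boolean's & then rejects the Column on the right-hand side. A minimal sketch of the direct translation, assuming df1 carries the same columns (a, b, c, d, fin) as in the Python snippet, would use Column's === for value equality and &&/|| in place of PySpark's &/|:

import org.apache.spark.sql.functions.{abs, col}

// === compares column values (eq would compare the Column objects
// themselves and return a Scala Boolean); && and || combine Column
// predicates, mirroring & and | in PySpark.
val df2 = df1.where(
  ((col("a") === col("b")) && (abs(col("c")) <= 1))
    || ((col("a") === col("fin")) && (col("b") <= 3) && (col("c") > 1) && (col("d") <= 500))
    || ((col("a") === col("b")) && (col("c") <= 15) && (col("c") > 3) && (col("d") <= 200))
    || ((col("a") === col("b")) && (col("c") <= 30) && (col("c") > 15) && (col("c") <= 100))
)

The parentheses around each comparison are not strictly required, since === and <= bind tighter than && in Scala, but they keep the translation line-for-line with the Python version.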