I don't have much experience with PySpark.
I need to count, for each row of a Spark DataFrame, how many values are greater than a certain threshold in absolute value. I tried the following, but it doesn't work:
from functools import reduce
from operator import add
from pyspark.sql.functions import abs, col, lit

n = lit(len(df.columns))
rank = reduce(add, (1 for x in df.columns[1:] if abs(col(x)) > threshold)).alias("rank")