Is there a way to compare two double columns in PySpark
with a specified margin of error?
Essentially similar to this post, but in PySpark.
Something like:
from pyspark.sql.functions import col, when, lit
df = ...  # some DataFrame with two double columns, RESULT1 and RESULT2
df = df.withColumn('compare', when(col('RESULT1').between(col('RESULT2') - 0.05 * col('RESULT2'), col('RESULT2') + 0.05 * col('RESULT2')), lit("match")).otherwise(lit("no match")))
But in a more elegant way?
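For context, here is a self-contained sketch of how I imagine this could look, assuming the margin is meant as a relative tolerance of 5% of RESULT2 and using an absolute-difference comparison (my interpretation); the sample data is made up purely for illustration:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative data: pairs of doubles to compare within a 5% relative margin.
df = spark.createDataFrame(
    [(100.0, 103.0), (100.0, 110.0), (2.0, 2.05)],
    ["RESULT1", "RESULT2"],
)

# "match" when the absolute difference is within 5% of RESULT2.
tolerance = 0.05
df = df.withColumn(
    "compare",
    F.when(
        F.abs(F.col("RESULT1") - F.col("RESULT2")) <= tolerance * F.abs(F.col("RESULT2")),
        F.lit("match"),
    ).otherwise(F.lit("no match")),
)

df.show()

Is there a cleaner or more idiomatic way to express this tolerance check?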