I have a PySpark pair RDD named DF holding (K, V) pairs, and I would like to apply multiple reduce functions with reduceByKey. For example, I have the following three simple functions:
def sumFunc(a, b): return a + b
def maxFunc(a, b): return max(a, b)
def minFunc(a, b): return min(a, b)
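For concreteness, here is a minimal sketch of how DF is shaped (the keys and values below are made-up sample data, not my real data):

from pyspark import SparkContext

sc = SparkContext.getOrCreate()

# Hypothetical (K, V) pairs with the same shape as my data
DF = sc.parallelize([("a", 1), ("a", 3), ("b", 2), ("b", 5)])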
When I apply only one function at a time, each of the following works:

DF.reduceByKey(sumFunc)  # works
DF.reduceByKey(maxFunc)  # works
DF.reduceByKey(minFunc)  # works
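With the sample data above, each single-function call gives the expected result, e.g.:

DF.reduceByKey(sumFunc).collect()  # [('a', 4), ('b', 7)] (order may vary)
DF.reduceByKey(maxFunc).collect()  # [('a', 3), ('b', 5)]
DF.reduceByKey(minFunc).collect()  # [('a', 1), ('b', 2)]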
But when I pass more than one function, the call fails; for example, none of the following works (I believe this is because reduceByKey takes a single reduce function, so the extra positional arguments are interpreted as numPartitions and partitionFunc):

DF.reduceByKey(sumFunc, maxFunc, minFunc)  # fails
DF.reduceByKey(sumFunc, maxFunc)           # fails
DF.reduceByKey(maxFunc, minFunc)           # fails
DF.reduceByKey(sumFunc, minFunc)           # fails
I do not want to use groupByKey, because it slows down the computation. Is there a way to apply several reduce functions in a single reduceByKey pass?
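For illustration, what I am after is a single pass that yields all three aggregates per key; with the sample data above, the result would be shaped something like:

# Desired output, one (sum, max, min) tuple per key:
# [('a', (4, 3, 1)), ('b', (7, 5, 2))]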