I have a Scala codebase that uses Spark, and I'd like the method foo to be callable from external Python code:
```scala
def foo(a: Int, b: String): String
```
I saw in Java Python Integration that I could use Jython, but I think that's overkill here.
Can't I just add a PySpark method that wraps the existing Scala/Spark method? If not, is there a simpler solution that doesn't require a special module like Jython?
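For context, here is roughly what I have in mind: a minimal sketch assuming foo is defined on a Scala object (com.example.Utils is a hypothetical name) and that the compiled JAR is on the driver's classpath, reached through PySpark's py4j gateway:

```python
from pyspark.sql import SparkSession

# Assumes the JAR containing com.example.Utils (hypothetical name) was
# added to the classpath, e.g. via spark.jars or --jars on spark-submit.
spark = SparkSession.builder.appName("scala-foo-bridge").getOrCreate()

def foo(a: int, b: str) -> str:
    # PySpark exposes the JVM through its py4j gateway; a method on a
    # Scala object is reachable as a static method on the generated class.
    return spark.sparkContext._jvm.com.example.Utils.foo(a, b)

print(foo(1, "bar"))
```

If something along these lines is viable, it would avoid Jython entirely, since py4j already ships with PySpark.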