
I have a Scala codebase that uses Spark, and I'd like the method foo to be callable externally from Python code.

```scala
def foo(a: Int, b: String): String
```

I saw here, in Java Python Integration, that I could use Jython. I think that's overkill, though.

Can't I add a PySpark method that wraps the Scala/Spark existing method?
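Something like the sketch below is what I have in mind: a thin Python wrapper that reaches foo through the Py4J gateway PySpark already runs. The object path com.example.Utils is just a placeholder for wherever foo actually lives, and I'm assuming foo is defined on a Scala object so it's reachable like a static method.

```python
# Hypothetical wrapper. `spark` is an active SparkSession, and foo is assumed
# to be defined on a Scala object, e.g. `object Utils { def foo(...) }`,
# under the placeholder package com.example.
def call_scala_foo(spark, a, b):
    # PySpark's Py4J gateway exposes the driver JVM; Python int/str arguments
    # are converted to Scala Int/String by Py4J at call time.
    jvm = spark.sparkContext._jvm
    return jvm.com.example.Utils.foo(a, b)
```

(The jar containing that Scala object would of course need to be on the driver's classpath, e.g. via `--jars` or `spark.jars`, for the lookup to succeed.)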

If not, isn't there a simpler solution, where I wouldn't need a special module like Jython?

  • Method to do what exactly? Or if you prefer where? – zero323 Jul 04 '16 at 15:23
  • 1
    Possible duplicate of [How to use Java/Scala function from an action or a transformation?](http://stackoverflow.com/questions/31684842/how-to-use-java-scala-function-from-an-action-or-a-transformation) – zero323 Jul 04 '16 at 15:24

0 Answers