Disclaimer
I do not know much Python, so the question describes "how it looks" and the answer should explain "how it actually works".
Question
PySpark allows you to run Python code in Spark. But Python is an interpreted language, and its behavior depends on the environment (e.g. whether you run it on a 32-bit or 64-bit platform), while Spark runs on the JVM, which executes code independently of the underlying platform.
So how is Python code "converted" into JVM bytecode? Or is it not run on the JVM at all? What technology is used (CORBA?)? I have heard about Jython, but it looks like an independent technology that is not used in PySpark. Is it?