
How can I ship compiled C extension modules (for example, python-Levenshtein) to each node in a Spark cluster?

I know that I can ship Python files in Spark using a standalone Python script (example code below):

from pyspark import SparkContext
# pyFiles ships the listed files to every node in the cluster
sc = SparkContext("local", "App Name", pyFiles=['MyFile.py', 'MyOtherFile.py'])

But in situations where there is no .py file, how do I ship the module?

  • Related to [Easiest way to install Python dependencies on Spark executor nodes?](https://stackoverflow.com/questions/29495435/easiest-way-to-install-python-dependencies-on-spark-executor-nodes) – zero323 May 02 '18 at 16:44

2 Answers


If you can package your module into a .egg or .zip file, you should be able to list it in pyFiles when constructing your SparkContext (or you can add it later through sc.addPyFile).
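For example, a minimal sketch of the sc.addPyFile route (the archive path, module name, and do_something function are hypothetical placeholders):

from pyspark import SparkContext

sc = SparkContext("local", "App Name")
# Ship a pre-built .egg or .zip to every executor node.
sc.addPyFile("/path/to/mymodule.egg")

def use_module(x):
    # Import inside the task so resolution happens on the executor,
    # where the shipped archive has been added to sys.path.
    import mymodule
    return mymodule.do_something(x)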

For Python libraries that use setuptools, you can run python setup.py bdist_egg to build an egg distribution.
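As a rough sketch, a minimal setup.py for a library with a C extension might look like this (the project and source names are placeholders); running python setup.py bdist_egg then produces an .egg under dist/:

from setuptools import setup, Extension

# Hypothetical layout: one C extension module built from mymodule.c
setup(
    name="mymodule",
    version="0.1",
    ext_modules=[Extension("mymodule", sources=["mymodule.c"])],
)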

Another option is to install the library cluster-wide, either by using pip/easy_install on each machine or by sharing a Python installation over a cluster-wide filesystem (like NFS).
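Whichever route you take, you can sanity-check that the library is importable on every executor with a trivial job. This sketch assumes python-Levenshtein (imported as Levenshtein) has already been installed cluster-wide or shipped as above:

from pyspark import SparkContext

sc = SparkContext("local", "App Name")

def edit_distance(pair):
    # Import inside the task so the import runs on the executor
    import Levenshtein
    return Levenshtein.distance(pair[0], pair[1])

pairs = sc.parallelize([("kitten", "sitting"), ("flaw", "lawn")])
print(pairs.map(edit_distance).collect())  # expect [3, 2]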
