I have followed some tutorials online, but they do not work with Spark 1.5.1
on OS X El Capitan (10.11).
Basically, I have run these commands to download apache-spark:
brew update
brew install scala
brew install apache-spark
updated the .bash_profile:
# For a ipython notebook and pyspark integration
if which pyspark > /dev/null; then
export SPARK_HOME="/usr/local/Cellar/apache-spark/1.5.1/libexec/"
export PYSPARK_SUBMIT_ARGS="--master local[2]"
fi
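(As a sanity check, this is just something I run myself rather than part of any tutorial, I verify from an IPython session that these variables are actually visible to the notebook process:)
import os
# Both should be set; if either prints None, the .bash_profile exports
# are not reaching the process that launches the notebook.
print(os.environ.get("SPARK_HOME"))
print(os.environ.get("PYSPARK_SUBMIT_ARGS"))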
ran
ipython profile create pyspark
created a startup file ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py,
configured like this:
# Configure the necessary Spark environment
import os
import sys
# Spark home
spark_home = os.environ.get("SPARK_HOME")
# If Spark V1.4.x is detected, then add ' pyspark-shell' to
# the end of the 'PYSPARK_SUBMIT_ARGS' environment variable
spark_release_file = spark_home + "/RELEASE"
if os.path.exists(spark_release_file) and "Spark 1.4" in open(spark_release_file).read():
pyspark_submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", "")
    if "pyspark-shell" not in pyspark_submit_args: pyspark_submit_args += " pyspark-shell"
os.environ["PYSPARK_SUBMIT_ARGS"] = pyspark_submit_args
# Add the spark python sub-directory to the path
sys.path.insert(0, spark_home + "/python")
# Add the py4j to the path.
# You may need to change the version number to match your install
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.8.2.1-src.zip"))
# Initialize PySpark to predefine the SparkContext variable 'sc'
execfile(os.path.join(spark_home, "python/pyspark/shell.py"))
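Since the py4j version hard-coded in that path may not match my install (as the comment above warns), I also check which zip actually ships with this Spark build; this is just a quick check of my own, not part of the setup:
import glob
import os
# Lists the py4j source zip(s) bundled with this Spark; whatever name shows up
# here is what should appear in the sys.path.insert line above.
print(glob.glob(os.path.join(os.environ["SPARK_HOME"], "python/lib/py4j-*-src.zip")))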
I then ran ipython notebook --profile=pyspark
and the notebook works fine, but sc
(the Spark context) is not recognised.
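For example, the very first cell I try fails:
# First notebook cell: 'sc' should have been predefined by shell.py,
# but instead this raises NameError: name 'sc' is not defined.
sc.parallelize(range(10)).count()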
Has anyone managed to do this with Spark 1.5.1?
EDIT: you can follow this guide to get it working.