I have installed Spark successfully and can launch it from my terminal with the "spark-shell" command, but I can't instantiate a SparkContext from a Jupyter notebook; I get this error:
OSError: [Errno 2] No such file or directory
Here is my code in the Jupyter notebook:
from pyspark import SparkContext
sc = SparkContext()
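To see what environment the notebook kernel itself is running with (it might not see the same variables as my terminal session), a quick check like this can be run in a cell; it is only a diagnostic sketch using the standard library, nothing Spark-specific:

import os
# These should reflect the exports in ~/.bash_profile below if the kernel picked them up.
print(os.environ.get("SPARK_HOME"))   # expected: /usr/local/spark
print(os.environ.get("PYTHONPATH"))   # expected to include $SPARK_HOME/python
print(os.environ.get("PATH"))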
Here is my ~/.bash_profile:
export PATH="/Users/NAME/anaconda2/bin:$PATH"
PYSPARK_DRIVER_PYTHON_OPTS="notebook" pyspark'
export SPARK_HOME='/usr/local/spark'
export PATH=$SPARK_HOME:$PATH
export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
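For reference, this is the kind of minimal snippet I understand is commonly used to make pyspark importable from a plain Python notebook kernel. It is only a sketch: it assumes the third-party findspark package is installed and that SPARK_HOME points at a valid Spark install, and the app name "notebook-test" is just a placeholder.

import findspark
findspark.init()   # reads SPARK_HOME and adds pyspark/py4j to sys.path

from pyspark import SparkContext
sc = SparkContext("local[*]", "notebook-test")   # explicit local master; app name is arbitrary
print(sc.version)
sc.stop()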