
I have installed Spark successfully and can launch it from my terminal with the spark-shell command, but I can't instantiate a SparkContext from a Jupyter notebook; I get this error:

OSError: [Errno 2] No such file or directory

Here is my code in the Jupyter notebook:

from pyspark import SparkContext
sc = SparkContext()
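
For what it's worth, [Errno 2] at SparkContext() time usually means PySpark could not find the executable it launches under the hood: creating a context shells out to $SPARK_HOME/bin/spark-submit to start the JVM. A quick sanity check from the same notebook, as a sketch assuming the paths in the profile below:

import os

# SPARK_HOME as the notebook process sees it (it may differ from the terminal,
# since Jupyter does not necessarily source ~/.bash_profile)
print("SPARK_HOME =", os.environ.get("SPARK_HOME", "(not set)"))

# The script SparkContext() launches; if it is missing, or SPARK_HOME is unset
# in this environment, the launch fails with OSError: [Errno 2]
print("spark-submit exists:",
      os.path.exists("/usr/local/spark/bin/spark-submit"))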

Here is my ~/.bash_profile:

export PATH="/Users/NAME/anaconda2/bin:$PATH"
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
export SPARK_HOME='/usr/local/spark'
export PATH=$SPARK_HOME:$PATH
export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
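
With this profile, two quick checks in a fresh terminal can show whether Spark is actually reachable; a sketch based on the paths above:

echo $SPARK_HOME                  # should print /usr/local/spark
ls $SPARK_HOME/bin/spark-submit   # the launcher script SparkContext() needs
which spark-submit                # note: the PATH line above adds the Spark root, not $SPARK_HOME/bin
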
doyz
  • Is the directory in export SPARK_HOME='/usr/local/spark' located where you have installed Jupyter? Please check. – KrazyGautam Feb 09 '18 at 12:38
  • @KrazyGautam No, my Jupyter is installed at /Users/NAME/anaconda2/bin/jupyter. – doyz Feb 09 '18 at 13:41
  • @KrazyGautam I resolved it by running PYSPARK_DRIVER_PYTHON="jupyter" PYSPARK_DRIVER_PYTHON_OPTS="notebook" pyspark in the command line (see the block below). This article helped: https://www.datacamp.com/community/tutorials/apache-spark-python. After that, there is no need to create a SparkContext manually. – doyz Feb 09 '18 at 14:53
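
For readers hitting the same issue, here is the command from the resolving comment, plus one way to make it stick (an assumption on my part: bash on macOS, matching the ~/.bash_profile above):

PYSPARK_DRIVER_PYTHON="jupyter" PYSPARK_DRIVER_PYTHON_OPTS="notebook" pyspark

# or, to make every pyspark launch open a notebook, add to ~/.bash_profile:
export PYSPARK_DRIVER_PYTHON="jupyter"
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"

Launched this way, pyspark starts the notebook server with a ready-made sc, which is why no SparkContext() call is needed afterwards.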

0 Answers