
When executing the following simple code with Sparkit-Learn:

from splearn.svm import SparkLinearSVC
spark = SparkLinearSVC()

I get the following error message:

ImportError: pyspark home needs to be added to PYTHONPATH.
export PYTHONPATH=$PYTHONPATH:$SPARK_HOME/python:../

Following these answers: unable to add spark to PYTHONPATH, importing pyspark in python shell, I have added every possible combination of those PYTHONPATH entries to my .bashrc, but the error still occurs.

Currently the relevant paths in my .bashrc look like this:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH=$JAVA_HOME/bin:$PATH
export PATH=/home/123/anaconda2/bin:$PATH
export SPARK_HOME=/home/123/Downloads/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64/server:$PATH
export PATH=$JAVA_HOME/jre/lib/amd64:$PATH
export PYTHONPATH=$SPARK_HOME/python/:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.9-src.zip:$PYTHONPATH
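To narrow down where this goes wrong, here is a small diagnostic sketch (my own addition, not from the original setup) that checks, from inside the Python interpreter that actually runs the splearn code, whether the exported variables and PYTHONPATH entries took effect. Note that `.bashrc` changes only apply to newly started shells:

```python
# Diagnostic sketch: verify the environment the interpreter really sees.
import os
import sys

# Is SPARK_HOME visible to this Python process?
spark_home = os.environ.get("SPARK_HOME")
print("SPARK_HOME:", spark_home)

# Did the PYTHONPATH entries make it onto sys.path?
for p in sys.path:
    if "spark" in p.lower() or "py4j" in p.lower():
        print("on sys.path:", p)

# Can pyspark itself be imported? splearn raises the ImportError
# shown above when this fails.
try:
    import pyspark
    print("pyspark import OK:", pyspark.__file__)
except ImportError as exc:
    print("pyspark import failed:", exc)
```

If `SPARK_HOME` prints as `None` or no Spark paths appear on `sys.path`, the environment from `.bashrc` is not reaching the process (for example, when Python is launched from an IDE or a notebook started before the `.bashrc` edit), and the variables need to be set in that environment instead.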

Any possible solution?
