
I have Apache Spark installed on Ubuntu, and I recently installed the IPython notebook. Everything ran fine until I restarted my laptop; now, when I run the following command:

IPYTHON_OPTS="notebook" $SPARK_HOME/bin/pyspark

it searches the wrong path for Java and gives the following error:

/home/username/spark-1.5.2-bin-hadoop2.6/bin/spark-class: line 77: /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java/bin/java: Not a directory

Obviously it is looking at the wrong Java home path, since it has an extra "/bin/java" appended to it. Echoing JAVA_HOME gives the correct path:

echo $JAVA_HOME
/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
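
For context, the failing line in spark-class simply appends "/bin/java" to whatever JAVA_HOME contains, roughly like this (a paraphrased sketch of the logic, not the exact script):

# Paraphrased sketch of the runner selection in bin/spark-class:
# if JAVA_HOME is set, use $JAVA_HOME/bin/java, otherwise fall back to java on PATH
if [ -n "${JAVA_HOME}" ]; then
  RUNNER="${JAVA_HOME}/bin/java"
else
  RUNNER="java"
fi
# So if JAVA_HOME already ends in .../jre/bin/java, the runner becomes .../jre/bin/java/bin/java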

Help me resolve this.

Rajat
  • Your JAVA_HOME is set incorrectly; it should stop at the root of the Java installation, which is `/usr/lib/jvm/java-7-openjdk-amd64/jre`. If you have `java_home` (e.g. `/usr/libexec/java_home`), you can do `export JAVA_HOME=$(/usr/libexec/java_home)` to set `$JAVA_HOME` correctly. – AChampion Jan 09 '16 at 02:48
  • Thanks. After following the steps in this post, it worked. Actually it should be just "/usr/lib/jvm/java-7-openjdk-amd64" (see the sketch after these comments): http://stackoverflow.com/questions/24641536/how-to-set-java-home-in-linux-for-all-users – Rajat Jan 09 '16 at 05:00
  • Try this [Installing Jupyter for Apache Spark](http://stackoverflow.com/questions/33064031/link-spark-with-ipython-notebook/33065359#33065359) – Alberto Bonsanto Jan 09 '16 at 12:47
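
Based on the comments above, a minimal sketch of the fix, assuming the OpenJDK 7 install lives at /usr/lib/jvm/java-7-openjdk-amd64 (adjust the path for your machine):

# Point JAVA_HOME at the installation root, not at the java binary
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

# Verify that the launcher path spark-class will build actually exists
ls "$JAVA_HOME/bin/java"

# Relaunch the notebook
IPYTHON_OPTS="notebook" $SPARK_HOME/bin/pyspark

To make the setting survive a restart, the same export line can go in ~/.bashrc (or /etc/environment for all users), which is the approach the linked question describes.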
