After installing Spark following this guide: https://www.davidadrian.cc/posts/2017/08/how-to-spark-cluster/
I got this message:
n@jupyter:~$ spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
java.net.UnknownHostException: jupyter: jupyter: Name or service not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
...
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.2.0
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_112-release)
scala> 1+1
res0: Int = 2
I am not sure why Spark is looking up jupyter at all, since I am only launching the shell.
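From the stack trace it looks like hostname resolution rather than a class lookup: Utils.findLocalInetAddress calls java.net.InetAddress.getLocalHost, which resolves the machine's own hostname. I assume the same failure can be reproduced directly in the REPL (untested sketch):

scala> java.net.InetAddress.getLocalHost  // the call Spark makes first, per the trace above
java.net.UnknownHostException: jupyter: jupyter: Name or service not known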
EDIT: Added my .bashrc config.
export PATH=/home/noel/pycharm/jre/bin:$PATH
export HADOOP_HOME=/home/xxx/hadoop_275
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH
export SPARK_HOME=/home/xxx/spark/spark22_hadoop27
export PATH=$SPARK_HOME/bin:$PATH
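I have also seen SPARK_LOCAL_IP mentioned as a way to bypass the hostname lookup; untested, but I assume it would go in .bashrc or conf/spark-env.sh like this:

# assumption: bind Spark to a fixed address instead of resolving the hostname
export SPARK_LOCAL_IP=127.0.0.1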
/etc/hosts:
127.0.0.1 localhost
#127.0.1.1 deep-learning
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
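Since the shell prompt shows the hostname is jupyter and the 127.0.1.1 entry is commented out, I assume the fix is to map the current hostname there, something like:

127.0.0.1   localhost
127.0.1.1   jupyter   # assumption: current hostname, replacing the old deep-learning entry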