I have used Spark in Scala for a long time. Now I am using pyspark for the first time. This is on a Mac.
- First I installed pyspark with conda install pyspark, which installed pyspark 2.2.0.
- Then I installed Spark itself with brew install apache-spark, which appears to have installed apache-spark 2.2.0 (version checks below).
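
For what it's worth, I believe these commands should confirm what each package manager installed (I'm assuming conda list and brew info report the versions that are actually active):

```sh
# Which pyspark launcher is first on the PATH (mine is the conda one)
which pyspark

# Versions reported by each package manager
conda list pyspark
brew info apache-spark
```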
But when I run pyspark, it dumps out:
```
/Users/me/anaconda/bin/pyspark: line 24: /Users/bruceho/spark-1.6/spark-1.6.2-bin-hadoop2.6/bin/load-spark-env.sh: No such file or directory
/Users/me/anaconda/bin/pyspark: line 77: /Users/bruceho/spark-1.6/spark-1.6.2-bin-hadoop2.6/bin/spark-submit: No such file or directory
/Users/me/anaconda/bin/pyspark: line 77: exec: /Users/bruceho/spark-1.6/spark-1.6.2-bin-hadoop2.6/bin/spark-submit: cannot execute: No such file
```
Why is it pointing to the 1.6.2 installation, which no longer seems to be there? Running brew search apache-spark does show entries for both 1.5 and 1.6. Shouldn't pyspark 2.2.0 automatically pick up the apache-spark 2.2.0 installation?
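
My current guess is that the conda launcher script is reading a stale SPARK_HOME left over from the old 1.6.2 install, but I have not confirmed that. This is only a sketch of what I plan to check; the grep over the shell startup files is just a guess at where such an export might live, and I'm assuming the Homebrew formula keeps the Spark distribution under libexec:

```sh
# Does SPARK_HOME still point at the removed 1.6.2 directory?
echo $SPARK_HOME

# Inspect the conda-installed launcher that produced the errors above
head -n 30 /Users/me/anaconda/bin/pyspark

# Look for a leftover export in the usual shell startup files
grep -n SPARK_HOME ~/.bash_profile ~/.bashrc ~/.profile 2>/dev/null

# Where Homebrew actually put Spark 2.2.0 (a candidate value for SPARK_HOME)
ls "$(brew --prefix apache-spark)/libexec"
```

If SPARK_HOME does turn out to point at the old directory, is updating it to the brew location the right fix, or should the conda pyspark 2.2.0 be able to find its Spark installation on its own?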