
I am new to PyCharm. I have configured everything properly, but I still face the issue below when I try to execute simple code:
Could not find valid SPARK_HOME while searching:

In the project interpreter I have updated the paths for Python, the py4j zip, the pyspark zip, and Spark up to `bin` (`bin` not included).

Versions are as below:

Python version - 3.6.4
Spark version - 2.2.1

Do I need to configure anything else?

ascripter

1 Answer


I am not sure how you have configured it, so you can use the code below for configuration.

Here I have used the winutils path for `HADOOP_HOME`. If you are using a real Hadoop installation, set `HADOOP_HOME` to its actual path instead.

import sys
import os

# Tell PySpark where Spark and winutils live (adjust the paths to your machine).
os.environ['SPARK_HOME'] = "C:/Users/LZ/Spark"
os.environ['HADOOP_HOME'] = "C:/Users/LZ/winutils"

# Make the PySpark modules importable.
sys.path.append("C:/Users/LZ/Spark/python")
sys.path.append("C:/Users/LZ/Spark/python/lib")
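A slightly more reusable variant is to derive the library paths from a single `SPARK_HOME` value, so only one line changes per machine. This is just a sketch; the paths are placeholders taken from the example above, and `setdefault` keeps any value already set in the environment:

```python
import os
import sys

# Set SPARK_HOME/HADOOP_HOME only if they are not already defined (placeholder paths).
spark_home = os.environ.setdefault("SPARK_HOME", "C:/Users/LZ/Spark")
os.environ.setdefault("HADOOP_HOME", "C:/Users/LZ/winutils")

# Derive the PySpark library paths from SPARK_HOME instead of hard-coding them.
for sub in ("python", "python/lib"):
    path = os.path.join(spark_home, sub)
    if path not in sys.path:
        sys.path.append(path)
```

This way, moving the project to another machine only requires changing (or pre-setting) `SPARK_HOME`.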

After adding the code above you should be able to run your code in PyCharm. Hope this helps.

LUZO
  • It's working fine. But I have to add this code to every `.py` file, right? –  Jan 17 '18 at 10:45
  • @Zo1o: Yes, you need to add it. – LUZO Jan 17 '18 at 10:47
  • Previously I used to work in the Eclipse IDE. There I just had to specify SPARK_HOME, HADOOP_HOME, and the Python paths. Can't we do the same in PyCharm? –  Jan 17 '18 at 10:50
  • [Please refer to this](https://stackoverflow.com/questions/34685905/how-to-link-pycharm-with-pyspark?rq=1) – LUZO Jan 17 '18 at 10:58