
I'm trying to install Spark on Windows 10, but I get an error when I try to run pyspark:

Failed to find Spark jars directory. You need to build Spark before running this program.

I've followed the steps indicated here up to step 4.

I went to Anaconda's Scripts and site-packages folders. In Scripts there are pyspark, spark-shell, and so on, but the pyspark folder in site-packages has neither a jars folder nor its own bin folder.

Where are the jars?
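For reference, a quick way to check where the pyspark package actually lives and whether it ships a jars folder (a minimal Python sketch; it only inspects the install, it does not fix anything):

    import os
    import pyspark

    # Directory where the pyspark package is installed (site-packages/pyspark)
    pkg_dir = os.path.dirname(pyspark.__file__)
    print("pyspark package:", pkg_dir)

    # A pip/conda-installed pyspark normally ships Spark's jars in this subfolder;
    # if it is missing, the launcher scripts cannot find the jars either.
    jars_dir = os.path.join(pkg_dir, "jars")
    print("jars folder exists:", os.path.isdir(jars_dir))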

marin lb
  • Did you set your environment variables correctly? Set SPARK_HOME and HADOOP_HOME to your Spark installation directory and add %SPARK_HOME%\bin to PATH. – MaFF Nov 15 '17 at 21:21
  • Hi MaFF, I set my environment variables, and they point to the bin folder that contains each program. – marin lb Nov 19 '17 at 11:58
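Following up on the comment about environment variables, a minimal sketch (the paths are hypothetical examples; point them at your real installation) that sets SPARK_HOME from Python and checks that the jars directory the launcher complains about actually exists:

    import os

    # Hypothetical paths for illustration; replace them with your own directories
    os.environ["SPARK_HOME"] = r"C:\spark\spark-2.2.0-bin-hadoop2.7"
    os.environ["HADOOP_HOME"] = r"C:\hadoop"

    # pyspark/spark-submit look for the Spark jars under %SPARK_HOME%\jars
    jars = os.path.join(os.environ["SPARK_HOME"], "jars")
    print("SPARK_HOME:", os.environ["SPARK_HOME"])
    print("jars folder exists:", os.path.isdir(jars))

    # Optionally, the findspark package (pip install findspark) can add
    # Spark's Python libraries to sys.path once SPARK_HOME is set:
    # import findspark; findspark.init()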

1 Answer


One cause of this error is that the SPARK_HOME path contains spaces, for example "C:/Users//apache spark/bin".

Just remove the spaces or move Spark to a directory whose path has no spaces.
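To see whether that is the case here, a small check (plain Python, nothing Spark-specific) that prints the configured SPARK_HOME and flags spaces in it:

    import os

    spark_home = os.environ.get("SPARK_HOME", "")
    print("SPARK_HOME:", spark_home)

    if " " in spark_home:
        # The Windows launch scripts can mis-parse paths with spaces and then
        # fail to locate the jars directory; a space-free path such as C:\spark
        # avoids the problem.
        print("Warning: SPARK_HOME contains spaces")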

VinayKumar.M