I am installing Spark on my machine, a Windows Server 2008 R2 box, and have followed the steps below (a rough sketch of these settings follows the list):

  1. Install the JDK.

  2. Add JAVA_HOME to the environment variables.

  3. Add winutils as HADOOP_HOME in the environment variables.

  4. Download Spark and configure the SPARK_HOME environment variable.
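
For reference, the variables can be set roughly like this from an elevated Command Prompt (a sketch only; the JDK, Hadoop, and Spark paths below are placeholders for the actual install directories):

    :: Set the variables machine-wide (/M); open a new console afterwards.
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_221" /M
    setx HADOOP_HOME "C:\hadoop" /M
    setx SPARK_HOME "C:\spark" /M
    :: HADOOP_HOME must be the folder that contains bin\winutils.exe.
    :: Then append %JAVA_HOME%\bin and %SPARK_HOME%\bin to the system Path
    :: via System Properties -> Environment Variables.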

I have not installed Scala.

When I try to call Spark from the command line, I get the error below.

[Screenshots of the error output: "The system cannot find the path specified."]

Please help me figure out how to fix this. I have an admin account on this server, and I set the path in the system variables.


1 Answer


The issue could be due to the \bin at the end of the JAVA_HOME path; Spark has issues with that. Remove the \bin from JAVA_HOME and run the spark-shell command again. It should work.
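
For example, assuming a typical JDK location (the exact version folder will differ from machine to machine):

    :: JAVA_HOME should be the JDK root, not its bin subfolder.
    :: Wrong:  C:\Program Files\Java\jdk1.8.0_221\bin
    :: Right:  C:\Program Files\Java\jdk1.8.0_221
    setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_221" /M
    :: The \bin part belongs on the Path variable instead: ...;%JAVA_HOME%\bin;...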

  • Thanks a lot, Adrian. I set it like %JAVA_HOME%;%SPARK_HOME%/bin.... but I still have the same issue. I am doing this in System Properties. Kindly help me if I am doing something wrong. – Sophie Dinka Sep 14 '19 at 03:55
  • Any suggestions, please? – Sophie Dinka Sep 15 '19 at 08:04
  • Are you getting the same error as above? Can you run the spark-shell command from the /spark/bin folder instead of the /spark folder and check? – adrian texeira Sep 15 '19 at 10:45
  • I did this: cd spark/bin, then spark-shell, but I still get the error "The system cannot find the path specified." I have configured all the environment variables under the system variables; not sure if that is the reason. – Sophie Dinka Sep 15 '19 at 13:08
  • This is how my environment variables are set: %JAVA_HOME%; %SPARK_HOME%\bin. Kindly suggest if I need to make any changes. – Sophie Dinka Sep 15 '19 at 13:17
  • Have you installed the **winutils** package? Also, can you refer to this link, in case you have not seen it: https://stackoverflow.com/questions/33734210/system-cannot-find-the-path-specified-in-spark-shell – adrian texeira Sep 16 '19 at 18:01
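
A quick way to sanity-check the setup from a new Command Prompt (a sketch; spark-shell.cmd is the Windows launcher shipped in Spark's bin folder):

    :: Each echo should print a real directory; JAVA_HOME must not end in \bin.
    echo %JAVA_HOME%
    echo %HADOOP_HOME%
    echo %SPARK_HOME%
    :: Confirm that java resolves through the Path:
    where java
    :: Launch the shell by its full path to rule out Path problems:
    "%SPARK_HOME%\bin\spark-shell.cmd"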