
I am trying to run my first Spark program, but I am stuck on this.

I am using Enthought Canopy for Python and have set my PATH variable to include %SPARK_HOME%\, %JAVA_HOME%\, and C:\Windows\System32. When I run spark-submit ratings-counter.py in my Canopy command prompt, it shows the error: "'spark-submit' is not recognized as an internal or external command, operable program or batch file." Any help would be great.

Ram Ghadiyaram

1 Answer


On Windows, go to the command prompt and type:

set SPARK_HOME

The Spark home directory will be printed. Then run the following command:

%SPARK_HOME%\bin\spark-shell

If the shell starts correctly, your configuration is correct. You can also try this through Canopy.
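The "not recognized" error usually means %SPARK_HOME%\bin is not on PATH, so spark-submit has to be called with its full path or PATH has to be extended first. A minimal sketch for the Windows command prompt, assuming Spark is installed at C:\spark (adjust the path to your own install; this is an illustration, not the asker's actual layout):

```bat
:: Point SPARK_HOME at the Spark install directory (hypothetical location)
set SPARK_HOME=C:\spark

:: Add the bin folder, which contains spark-submit.cmd, to PATH for this session
set PATH=%PATH%;%SPARK_HOME%\bin

:: Now spark-submit should be found from any directory
spark-submit ratings-counter.py
```

Note that `set` only affects the current command-prompt session; to make it permanent, set the variables in System Properties → Environment Variables.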

Further, look at the following.

Tip: to find an existing Spark configuration from a Linux prompt:

Find the location of your existing Spark install and the Spark configuration being used. This is usually in /etc/spark/conf:

readlink -f $(which spark-submit)   # Windows doesn't have readlink

The output would look like:

/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.27

/opt/cloudera/parcels/CDH-5.5.0-1.cdh5.5.0.p0.27/lib/spark/conf -> /etc/spark/conf
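The readlink trick above is Linux-only. As a rough cross-platform analogue (a sketch, not part of the original answer), Python's shutil.which performs the same PATH search the shell does, so it can show whether spark-submit is reachable at all:

```python
import os
import shutil

# shutil.which searches PATH the same way the command prompt does.
# It returns None when spark-submit's bin directory is not on PATH.
path = shutil.which("spark-submit")

if path is None:
    print("spark-submit is not on PATH - add %SPARK_HOME%\\bin "
          "(or $SPARK_HOME/bin on Linux) to PATH")
else:
    # Resolve symlinks to the real install location (like readlink -f)
    print("spark-submit resolves to", os.path.realpath(path))
```

If this prints None on the machine where spark-submit fails, the problem is the PATH setting, not Spark itself.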
  • Sir, I did as you said above, and Spark is running perfectly fine. Now how do I run an application in Enthought Canopy? I am doing spark-submit "name of the file" and hitting enter, but it is showing the same "not recognized" error. – Kashish Khivesara Nov 07 '16 at 15:19
  • Please see my update; there is a different way of setting the Spark home path. – Ram Ghadiyaram Nov 07 '16 at 18:27