I am trying to install Apache Spark on Windows for use with Python. But when I execute C:\Spark\bin\spark-shell, it shows "'cmd' is not recognized as an internal or external command". How do I fix it? I suspect there is a problem with the environment variables, but I am not sure.
-
Read the answers on [What is the reason for 'sort' is not recognized as an internal or external command, operable program or batch file?](https://stackoverflow.com/questions/41454769/) `cmd` is really `%SystemRoot%\System32\cmd.exe`, and this executable must be found, like all other executables specified without a file extension and without a full path, via the environment variables __PATH__ and __PATHEXT__. I suppose the local __PATH__ as set by Windows on starting the process was replaced by something different, with the result that `cmd.exe` is no longer found. – Mofi Jul 13 '17 at 17:27
-
Possible duplicate of [What is the reason for 'sort' is not recognized as an internal or external command, operable program or batch file?](https://stackoverflow.com/questions/41454769/what-is-the-reason-for-sort-is-not-recognized-as-an-internal-or-external-comma) – user10089632 Sep 01 '17 at 22:13
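The PATH lookup described in the comments above can be checked programmatically. A minimal sketch in Python, assuming `shutil.which` is acceptable as a stand-in for the shell's own resolution (on Windows it also honors __PATHEXT__); the names `cmd` and `spark-shell` are the ones relevant to this question:

```python
import os
import shutil

def resolvable(name):
    """Return True if `name` can be found through the PATH (and, on
    Windows, PATHEXT) lookup used for commands typed without a full path."""
    return shutil.which(name) is not None

if __name__ == "__main__":
    # On Windows, both of these must resolve for spark-shell to launch:
    #   resolvable("cmd")         - the shell itself must be reachable via PATH
    #   resolvable("spark-shell") - requires C:\Spark\bin to be on PATH
    # Print the directories currently searched, one per line:
    for entry in os.environ.get("PATH", "").split(os.pathsep):
        print(entry)
```

If `resolvable("cmd")` is False, the system __PATH__ has been damaged and should be restored to include `%SystemRoot%\System32` before anything Spark-related is debugged.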
1 Answer
I think your Apache Spark archive was not extracted properly. Try extracting it again, for example with a different tool.
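One quick way to sanity-check the extraction is to verify that the launcher scripts exist under `bin`, which is where a standard Spark distribution places them. A sketch with a hypothetical helper name (`looks_extracted` is not part of Spark):

```python
import os

def looks_extracted(spark_home):
    """Hypothetical check that a Spark extraction looks complete: the bin
    directory should contain the launcher scripts (with .cmd variants on
    Windows, hence the prefix match)."""
    expected = ["spark-shell", "spark-submit"]
    bin_dir = os.path.join(spark_home, "bin")
    if not os.path.isdir(bin_dir):
        return False
    names = os.listdir(bin_dir)
    return all(any(n.startswith(e) for n in names) for e in expected)
```

If this returns False for `C:\Spark`, the archive was likely truncated or extracted into the wrong directory, and re-extracting is the right first step.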

Ambuj