
I followed the instructions for installing PySpark on Windows, but I'm running into an issue similar to How to troubleshoot 'pyspark' is not recognized... error on Windows? The error message I get when I type 'pyspark' in cmd is:

'C:\Users\simon\AppData\Local\Programs\Python\Python310' is not recognized as an internal or external command, operable program or batch file

I've tried installing different Python versions (and editing the environment variables accordingly), but the result is the same. The environment variables are set as follows:

PYSPARK_PYTHON=C:\Users\simon\AppData\Local\Programs\Python\Python310
SPARK_HOME=C:\apps\spark-3.3.0-bin-hadoop3
HADOOP_HOME=C:\apps\spark-3.3.0-bin-hadoop3

Can you help track down this error?

Solved: appending \python.exe to PYSPARK_PYTHON fixed the problem. It's strange, because none of the tutorials I saw included the .exe, but oh well!

1 Answer


Adding \python.exe to PYSPARK_PYTHON solved the problem. The pyspark launcher script tries to run the value of PYSPARK_PYTHON as a command, so it must point to the python.exe executable itself, not just the Python installation directory; that is why cmd reported the bare directory path as "not recognized as an internal or external command".
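
For reference, assuming the same install paths shown in the question (your username and Python version will differ), the working settings look like this; the only change from the original setup is the trailing \python.exe:

PYSPARK_PYTHON=C:\Users\simon\AppData\Local\Programs\Python\Python310\python.exe
SPARK_HOME=C:\apps\spark-3.3.0-bin-hadoop3
HADOOP_HOME=C:\apps\spark-3.3.0-bin-hadoop3

If you also set PYSPARK_DRIVER_PYTHON, it needs the full path to python.exe for the same reason.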