I'm trying to install Apache Spark on Windows 10. I downloaded Spark and winutils.exe, set SPARK_HOME and HADOOP_HOME, and updated the PATH variable to include Spark's bin directory. Still, when I run spark-shell I get the error below. What's the problem?
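Concretely, the variables were set roughly like this in cmd.exe (the SPARK_HOME path matches my install directory; the HADOOP_HOME path shown is just an example of where winutils.exe could live):

```shell
:: Session-only setup in cmd.exe; quoting the whole "NAME=value" pair
:: keeps paths with spaces intact.
:: C:\tools\hadoop is an assumed location -- winutils.exe must be in %HADOOP_HOME%\bin
set "SPARK_HOME=C:\tools\spark-2.1.1-bin-hadoop2.7"
set "HADOOP_HOME=C:\tools\hadoop"
set "PATH=%PATH%;%SPARK_HOME%\bin"
```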
C:\tools\spark-2.1.1-bin-hadoop2.7\bin>spark-shell
'""C:\Program' is not recognized as an internal or external command,
operable program or batch file.