I previously had PySpark installed as a Python package through pip. I recently uninstalled it, installed a clean version of Python, and downloaded the standalone Spark distribution.
In my user variables I created a variable named SPARK_HOME with the value C:\spark-2.3.2-bin-hadoop2.7\bin.
In my system variables, under Path, I added the entry C:\spark-2.3.2-bin-hadoop2.7\bin.
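For reference, this is how the variables can be sanity-checked from a fresh Command Prompt (a minimal check; echo and where are standard cmd commands):

```
:: Print the value of SPARK_HOME; if the literal text %SPARK_HOME% is echoed back, the variable is not set in this session
echo %SPARK_HOME%

:: Show which spark-shell the Path resolves to; an error here means Path does not reach it
where spark-shell
```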
I cannot run spark-shell either. Any ideas?