I'm trying to install Spark on Windows 10, but I get an error when I try to run pyspark:
Failed to find Spark jars directory. You need to build Spark before running this program.
I've followed the steps indicated here up to step 4.
I went to Anaconda's Scripts and site-packages folders. In Scripts there are pyspark, spark-shell and so on, BUT the pyspark folder in site-packages has neither a jars folder nor its own bin folder (I checked as shown below).
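For reference, this is a minimal sketch of how I'm checking the install location, assuming pyspark was installed into Anaconda's site-packages via pip or conda:

import os
import pyspark

# Directory of the installed pyspark package, e.g. ...\site-packages\pyspark
install_dir = os.path.dirname(pyspark.__file__)
print("pyspark installed at:", install_dir)

# Check for the folders the error seems to be complaining about
print("has jars folder:", os.path.isdir(os.path.join(install_dir, "jars")))
print("has bin folder: ", os.path.isdir(os.path.join(install_dir, "bin")))

Both checks print False on my machine.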
Where are the jars?