I am trying to install Spark in standalone mode on Windows, and when I try to run the bin/spark-shell
command, it gives me the following error:

-
I'm voting to close this question as off-topic because it has to do with specific software not working correctly rather than a programming issue – pagoda_5b Jan 16 '16 at 01:50
1 Answer
It appears you have downloaded the pre-built binaries for Linux and tried to run them on Windows? Could you provide more detail on your setup? Also, please post the error as text so it's searchable; if this question lasts, you want people Googling the error to be able to find it.
To run Spark on Windows, you have to build it from source. There's a similar SO question here (perhaps a dupe? I can't mark it as such): How to set up Spark on Windows?
More to the point, here are the Spark docs on how to build Spark for Windows. Here's the relevant text from the Overview page:
If you’d like to build Spark from source, visit Building Spark.
Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It’s easy to run locally on one machine — all you need is to have java installed on your system PATH, or the JAVA_HOME environment variable pointing to a Java installation.
Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.0 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
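To make that concrete, here is a minimal sketch of the Windows (cmd) setup the docs describe, followed by the kind of Maven build command the Building Spark page shows. The JDK path, Spark directory, and Hadoop profile below are illustrative assumptions, not values from the question; adjust them to your actual install:

    :: Point JAVA_HOME at a JDK and put it on PATH (path is an example)
    set JAVA_HOME=C:\Program Files\Java\jdk1.7.0_79
    set PATH=%JAVA_HOME%\bin;%PATH%
    java -version

    :: Build Spark from source (requires Maven on PATH); the Hadoop
    :: profile/version is an example, match it to your environment
    cd C:\spark-1.6.0
    set MAVEN_OPTS=-Xmx2g -XX:MaxPermSize=512M
    mvn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package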
If you are not prepared to build Spark from source, then perhaps you could use VirtualBox or VMware and run a Linux VM, but that would probably only be good for testing in local[*] mode. As the docs near that link put it, "However, for local testing and unit tests, you can pass 'local' to run Spark in-process."
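Once you have a working build or distribution, local mode is just a matter of the master URL you pass to spark-shell; something like this, assuming you run it from the Spark directory:

    :: In-process local mode, single thread
    bin\spark-shell --master local

    :: In-process local mode using all available cores
    bin\spark-shell --master local[*]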
You might get away with running a master/driver and a worker/executor inside a single VM, but I would not expect it to play well in a networked setting. Without trying it, it's hard for me to predict what specifically would go wrong, but I suspect serialization would be an issue, for starters.
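If you do try the all-in-one-VM route, a rough sketch of the moving parts, assuming a stock Spark layout inside the Linux VM (these are the standard standalone-mode scripts, but I haven't tested this exact setup):

    # Inside the Linux VM -- everything runs in one machine
    sbin/start-master.sh                           # standalone master, web UI on port 8080 by default
    sbin/start-slave.sh spark://$(hostname):7077   # worker registers with that master
    bin/spark-shell --master spark://$(hostname):7077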
Better yet, get a cheap PC, install Linux, and go from there.
-
Actually I am new to Spark and trying to install it in standalone mode. I downloaded the Spark folder and put it in a local path, and installed Scala too. I am getting an error like: C:\spark-1.3.1-bin-hadoop2.6>bin\spark-shell Error: The system cannot find the path specified. I tried on another machine, and there I got another error: no time to execute. Please suggest on this – Eswar Kumar Jan 12 '16 at 09:42
-
Standalone mode does not mean it "stands alone" from a cluster; it means Spark uses its own job manager instead of YARN or Mesos, therefore "stands alone." As far as I can tell based on what you supplied, you have downloaded the Linux binaries; you need to build the Windows version from source. – JimLohse Jan 12 '16 at 17:54
-
@Eswar, further, please edit your question to use the SO editor properly so your errors are searchable. You could look at the tour page to get a good idea of how to ask good questions: http://stackoverflow.com/tour and http://stackoverflow.com/help/how-to-ask. Then provide more detail, OK? – JimLohse Jan 12 '16 at 17:55
-
Thanks JimLohse, I have downloaded the Linux binaries and it is working fine – Eswar Kumar Jan 14 '16 at 07:32