
Issue

When I try to run spark-shell, I get a huge error message, which you can see here: https://pastebin.com/8D6RGxUJ

Install

I followed this tutorial, but I already have Python and Java installed, and I used Spark 3.2.0 instead.

Config: Windows 10


3 Answers


My guess is that you have to put winutils.exe in the %SPARK_HOME%\bin folder. I discovered that after starting from scratch and following this tutorial!
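For reference, a minimal sketch of the layout I mean, from a Command Prompt; the install path is just an example, adjust it to wherever you unpacked Spark:

    :: example install location (yours may differ)
    set SPARK_HOME=C:\spark\spark-3.2.0-bin-hadoop3.2
    set HADOOP_HOME=%SPARK_HOME%
    set PATH=%PATH%;%SPARK_HOME%\bin

    :: winutils.exe goes next to the other Spark binaries
    copy winutils.exe %SPARK_HOME%\bin\

Note that set only affects the current console session; for a permanent setup you'd add the same variables through the environment variables dialog.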

    Only moving winutils.exe didn't work for me; I'll try starting from scratch with your tutorial later, thanks – Butanium Jan 15 '22 at 10:02

By following this answer to a similar question, I downgraded from Spark 3.2.1 to 3.0.3, and this seems to have solved the problem.
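If you try the downgrade, it's worth confirming which version spark-shell will actually pick up (this assumes the new install's bin folder is on your PATH):

    :: prints the Spark, Scala and Java versions currently on the PATH
    spark-submit --version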


I managed to solve the problem with the following configuration:

  • Spark: spark-3.2.1-bin-hadoop2.7
  • Hadoop: winutils.exe and hadoop.dll (version 2.7.7 for both)
  • JDK: jdk-18.0.1

And I recommend that you put the environment variables under User, not System.
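For example, from a regular (non-admin) Command Prompt, setx writes to the User scope by default; the paths below are examples matching my setup, so adjust them to yours:

    :: setx without /M targets the User environment, as recommended
    setx SPARK_HOME "C:\spark\spark-3.2.1-bin-hadoop2.7"
    setx HADOOP_HOME "C:\hadoop-2.7.7"
    setx JAVA_HOME "C:\Program Files\Java\jdk-18.0.1"

    :: (setx /M would write to System instead, and needs admin rights)

Newly started consoles will see the updated values; already-open ones won't.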
