60

I am trying to run a MapReduce program (Hadoop 2.7) on Windows 7 64-bit in Eclipse, and the above exception occurs while running it. I verified that I am using 64-bit Java 1.8 and observed that all the Hadoop daemons are running.

Any suggestions are highly appreciated.

blackbishop
sukumar konduru
  • It seems that Java is unable to find a native library. Maybe a DLL file is corrupted or not accessible by the Java program. – Anil Agrawal Jan 25 '17 at 12:32
  • There is a similar issue; you can check it here: http://stackoverflow.com/questions/18630019/running-apache-hadoop-2-1-0-on-windows – Anil Agrawal Jan 25 '17 at 12:32
  • In that case the problem occurs while starting the daemons, but in my case there are no issues while starting; the exception occurs while executing the MapReduce program. – sukumar konduru Jan 25 '17 at 13:05
  • It seems a native library required by the MapReduce program is not accessible. Please share the stack trace for more information. – Anil Agrawal Jan 25 '17 at 19:22

19 Answers

61

In addition to the other solutions, please download winutils.exe and hadoop.dll and add them to $HADOOP_HOME/bin. It works for me.

https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin

Note: I'm using the hadoop-2.7.3 version.
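If you prefer not to set HADOOP_HOME globally, note that Hadoop also reads the hadoop.home.dir system property before falling back to the environment variable. Below is a minimal sketch of a driver that sets it programmatically; the C:\hadoop path is only an example and must point at the folder whose bin subfolder contains winutils.exe and hadoop.dll:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class Driver {
        public static void main(String[] args) throws Exception {
            // Example path: the folder containing bin\winutils.exe and bin\hadoop.dll
            System.setProperty("hadoop.home.dir", "C:\\hadoop");

            Job job = Job.getInstance(new Configuration(), "my-mr-job");
            // ... configure mapper, reducer, input and output paths as usual ...
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }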


Thirupathi Chavati
  • I have encountered the same problem while starting the Namenode. I am unable to start the Namenode. Please help. – Jon Andrews Jun 04 '20 at 06:53
  • I am using Hadoop 3.0.0. I downloaded hadoop.dll and winutils.exe from the GitHub link you provided, but still no luck; it throws the same error while starting the Namenode. – Jon Andrews Jun 04 '20 at 06:55
  • 4
    The link above doesn't have 3.1.1. (yet?). Found this one that seems to be working for the newest version: https://github.com/kontext-tech/winutils – Dan Mar 15 '21 at 19:35
  • @JonAndrews - were you able to sort it out at the end? Having the same issue, the same hadoop version... Thanks! – alekseevi15 Mar 29 '21 at 11:55
  • I am using hadoop 3.3.0 and spark-3.3.0-bin-hadoop3. I am still getting the error. Can anyone help me with this? – Avinash Sep 16 '22 at 11:29
  • it works for me (spark 3.3.1) – Егор Лебедев Dec 10 '22 at 18:00
  • Nothing worked until I restarted the computer. I didn't understand what it changed, but the error disappeared. Maybe it will help someone – Razorfever Feb 19 '23 at 16:36
53

After putting hadoop.dll and winutils.exe in the hadoop/bin folder and adding that folder to PATH, we also need to put hadoop.dll into the C:\Windows\System32 folder.
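To check whether the JVM can actually see hadoop.dll after these steps, here is a quick diagnostic sketch (plain Java, no Hadoop classes required):

    public class NativeLibCheck {
        public static void main(String[] args) {
            // Directories the JVM searches for native libraries; on Windows this
            // includes System32 and every folder on PATH
            System.out.println("java.library.path = " + System.getProperty("java.library.path"));
            try {
                // System.loadLibrary("hadoop") resolves to hadoop.dll on Windows;
                // it throws if the DLL is not on the search path
                System.loadLibrary("hadoop");
                System.out.println("hadoop.dll loaded successfully");
            } catch (UnsatisfiedLinkError e) {
                System.out.println("hadoop.dll NOT found: " + e.getMessage());
            }
        }
    }

If this prints the UnsatisfiedLinkError message, the DLL is still not visible to the JVM, which is exactly what copying it to System32 (or fixing PATH) resolves.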

OneCricketeer
JasonWayne
  • 1
    Have done this but the error still pops up. Any suggestions? – Jaison Apr 13 '20 at 06:20
  • 1
    I didn't need the `C:\Windows\System32` step. `winutils.exe` and `hadoop.dll` in the Hadoop bin folder did it for me. The environment variable and path settings are critical though. – Dan Mar 15 '21 at 19:32
  • 5
    @Dan Dunno why, but putting hadoop.dll into the C:\Windows\System32 folder was critical for me as well. – diman82 Jul 10 '21 at 22:36
  • @diman82 Did you have the environment variable `HADOOP_HOME` and path set-up and restarted? (Not critical, things are running for you; I'm simply curious what's causing the difference). – Dan Jul 13 '21 at 00:49
  • 1
    No need to put hadoop.dll into C:\Windows\System32. It should work if you have hadoop.dll in a folder on your Path variable. When you put it into System32 it is definitely on the Path. – Vadim Zin4uk Jan 04 '23 at 09:50
26

This issue occurred for me because I had forgotten to append %HADOOP_HOME%/bin to PATH in my environment variables.
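Here is a small sketch to verify the environment the JVM actually sees, i.e. that HADOOP_HOME is set and that winutils.exe is where Hadoop expects it (assumes a standard Windows layout):

    import java.io.File;

    public class EnvCheck {
        public static void main(String[] args) {
            String home = System.getenv("HADOOP_HOME");
            String path = System.getenv("PATH");
            System.out.println("HADOOP_HOME = " + home);
            if (home != null && path != null) {
                // Hadoop's Shell class looks for %HADOOP_HOME%\bin\winutils.exe
                System.out.println("winutils.exe present: "
                        + new File(home, "bin\\winutils.exe").exists());
                // PATH comparison is case-insensitive on Windows
                System.out.println("bin folder on PATH: "
                        + path.toLowerCase().contains((home + "\\bin").toLowerCase()));
            }
        }
    }

Remember that a process only sees the environment it was started with, so Eclipse (or your terminal) must be restarted after changing PATH.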

Julius Delfino
  • It's worth noting that the ``HADOOP_HOME`` user or system variable has to be created first before the above can be added to the ``PATH``. Typically, ``HADOOP_HOME`` would point to where your hadoop is installed such as ``C:\hadoop-2.8.1``. And as always with changing the ``PATH`` variable, a restart is usually required. – Sal Jul 23 '19 at 20:22
10

In my case I was having this issue when running unit tests on my local machine after upgrading dependencies to CDH6. I already had the HADOOP_HOME and PATH variables configured properly, but I had to copy hadoop.dll to C:\Windows\System32 as suggested in the other answer.
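If copying the DLL into System32 is not an option on a shared build machine, one workaround for tests is to point hadoop.home.dir at a checked-in folder before any Hadoop class loads. A sketch assuming JUnit 4 and a hypothetical src/test/resources/hadoop folder containing bin\winutils.exe and bin\hadoop.dll:

    import java.io.File;
    import org.junit.BeforeClass;

    public class HadoopNativeSetupTest {
        @BeforeClass
        public static void configureHadoopHome() {
            // Hypothetical location of the native binaries checked into the repo
            File hadoopHome = new File("src/test/resources/hadoop");
            System.setProperty("hadoop.home.dir", hadoopHome.getAbsolutePath());
            // hadoop.dll itself must still be loadable; appending to java.library.path
            // at runtime is unreliable, so pass -Djava.library.path=<hadoopHome>\bin
            // to the test JVM instead.
        }
    }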

OneCricketeer
OsvaldoP
6

After trying all the above, things worked only after putting hadoop.dll into C:\Windows\System32.

4

For me this issue was resolved by downloading winutils.exe and hadoop.dll from https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1/bin and putting them in the hadoop/bin folder.

Aman Tandon
4

Adding hadoop.dll and winutils.exe fixed the error; support for the latest versions can be found here.

3

I already had %HADOOP_HOME%/bin in my PATH and my code had previously run without errors. Restarting my machine made it work again.

Ben Watson
2

A version mismatch is the main cause of this issue. Matching the native library to your exact Hadoop version will solve it. If you still face the issue and are working on a Hadoop 3.1.x version, use this repository to download the bin folder:

https://github.com/s911415/apache-hadoop-3.1.0-winutils/tree/master/bin
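To find out which Hadoop version is actually on your classpath (so you can download the matching winutils.exe and hadoop.dll), here is a one-line check using Hadoop's own VersionInfo class:

    import org.apache.hadoop.util.VersionInfo;

    public class HadoopVersionCheck {
        public static void main(String[] args) {
            // Prints the version of the hadoop-common jar on the classpath, e.g. "3.1.0";
            // the native binaries you download should match this version
            System.out.println("Hadoop version: " + VersionInfo.getVersion());
        }
    }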

1

I already had %HADOOP_HOME%/bin in my PATH. Adding hadoop.dll to the Hadoop/bin directory made it work again.

ahmedshahriar
0

In IntelliJ, under Run/Debug Configurations, open the application you are trying to run and, under the Configuration tab, specify the exact working directory. Using a variable to represent the working directory also creates this problem. When I changed the working directory under the configuration, it started working again.

hemanth
0

Yes, this issue arose when I was using PigUnit for the automation of Pig scripts. Two things need to be done in sequence:

  1. Copy both files mentioned above to a location and add that location to the PATH environment variable.

  2. To pick up the change you have just made, restart your machine so the files are loaded.

Under JUnit I was getting this error, which may help others as well:

    org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias XXXXX.
    Backend error : java.lang.IllegalStateException: Job in state DEFINE instead of RUNNING
        at org.apache.pig.PigServer.openIterator(PigServer.java:925)

kenlukas
ravi
0

This is what worked for me: download the latest winutils from https://github.com/kontext-tech/winutils, or check your Spark RELEASE text file, which shows the version of Hadoop it is using.

Steps

  1. Download the repo

  2. Create a folder named hadoop anywhere (e.g. desktop/hadoop)

  3. Paste the bin folder into it (you will then have hadoop/bin)

  4. Copy hadoop.dll to C:\Windows\System32

  5. Set the system environment variables:

    set HADOOP_HOME=c:/desktop/hadoop
    set PATH=%PATH%;%HADOOP_HOME%/bin;
    
Ardan
0

For me, I had to download winutils from https://github.com/kontext-tech/winutils as it has the latest version, 3.3.

It is important to make sure the version matches the Hadoop version you downloaded (with or without Spark); otherwise you can run into some weird error messages.

Both hadoop.dll and winutils.exe are fine in the same folder, C:/hadoop/bin. I didn't copy either to the system folder and it works.

Note: I followed this, except for the download page of the winutils tool.

OneCricketeer
maoyang
0

After downloading and configuring hadoop.dll and winutils.exe as in the previous answers, you need to restart Windows to make it work.

Bao Nguyen
  • 1) Download from where? What is there to "configure"? 2) You should only need to edit your PATH and restart the terminal, not the whole OS – OneCricketeer Mar 29 '23 at 14:29
0

In my case (pyspark = 3.3.1, Spark version = 3.3.1, Hadoop version = 3.3.2) I set the env vars from Python code:

    import os
    import sys

    os.environ['PYSPARK_PYTHON'] = sys.executable
    os.environ['HADOOP_HOME'] = "C:\\Program Files\\Hadoop\\"

Then I added the latest version of the Hadoop files (hadoop-3.4.0-win10-x64) from https://github.com/kontext-tech/winutils to the bin folder, and added hadoop.dll to C:\Windows\System32.

Yegor
  • You should be using the 3.3.2 winutils, since that's what you have installed, not a version of it for an unreleased Hadoop version – OneCricketeer Mar 29 '23 at 14:24
0

I'm using Spark version 3.1.2 and Hadoop 3.2 (spark-3.1.2-bin-hadoop3.2).

I solved this by just downloading the hadoop.dll file from the GitHub page https://github.com/cdarlint/winutils and saving it to the bin folder within my Spark folder. Then spark-submit ran smoothly.

OneCricketeer
0

All of the above recommendations failed to resolve this error for me. Setting the environment variable HADOOP_BIN_PATH resolved it, as well as a related error in which the word access0 in the error string is replaced with createDirectoryWithMode0.

If you look in the hadoop/bin folder and inspect hdfs.cmd and mapred.cmd, it is clear that the HADOOP_BIN_PATH environment variable is expected. For example, this code:

if not defined HADOOP_BIN_PATH ( 
  set HADOOP_BIN_PATH=%~dp0
)

%~dp0 expands to the folder containing the cmd file. However, other components of Hadoop, such as daemons that are not started from the command line, may also expect this environment variable to be set.

Set HADOOP_BIN_PATH to the bin folder under your hadoop directory.

If you set a user environment variable you will need to restart the process running your application. If you set a system environment variable you will need to restart Windows to see the effect.
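If you launch the Hadoop scripts from your own code rather than from a console, the variable can also be supplied per process. A sketch using ProcessBuilder; the paths are examples only:

    import java.io.IOException;

    public class LaunchWithBinPath {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Example paths; adjust to your installation
            ProcessBuilder pb = new ProcessBuilder("C:\\hadoop\\bin\\hdfs.cmd", "version");
            pb.environment().put("HADOOP_BIN_PATH", "C:\\hadoop\\bin");
            pb.inheritIO(); // forward the script's output to this console
            Process p = pb.start();
            System.out.println("exit code: " + p.waitFor());
        }
    }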

-2

This might be old, but if it's still not working for someone: Step 1, double-click winutils.exe. If it reports that some .dll file is missing, download that .dll file and place it in the appropriate location.

In my case, msvcr100.dll was missing and I had to install the Microsoft Visual C++ 2010 Service Pack 1 Redistributable Package to make it work. All the best.