14

I submit my MapReduce jobs from a Java application running on Windows to a Hadoop 2.2 cluster running on Ubuntu. In Hadoop 1.x this worked as expected, but on Hadoop 2.2 I get a strange error:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z

I compiled the necessary Windows libraries (hadoop.dll and winutils.exe) and can access HDFS via code and read the cluster information using the Hadoop API. Only the job submission does not work.

Any help is appreciated.

Solution: I found it out myself: the path where the Windows Hadoop binaries can be found has to be added to the PATH variable of Windows.
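
For reference, a minimal sketch of what the Windows client side ends up needing (the `C:\hadoop-2.2.0` path and the cluster addresses are placeholders, not from my actual setup): the `hadoop.home.dir` property tells the Hadoop client where winutils.exe lives, and the same bin folder must additionally be on the Windows PATH so the JVM can load hadoop.dll.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class SubmitFromWindows {
    public static void main(String[] args) throws Exception {
        // Placeholder: folder whose bin\ subdirectory contains hadoop.dll and winutils.exe.
        // Equivalent to setting the HADOOP_HOME environment variable; in addition, that
        // bin\ folder must be on the Windows PATH so hadoop.dll can be loaded natively.
        System.setProperty("hadoop.home.dir", "C:\\hadoop-2.2.0");

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000");            // placeholder HDFS address
        conf.set("mapreduce.framework.name", "yarn");
        conf.set("yarn.resourcemanager.address", "namenode:8032");   // placeholder RM address

        Job job = Job.getInstance(conf, "windows-client-job");
        // ... set jar, mapper, reducer, input/output paths here ...
        // System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```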

padmalcom
  • 1,156
  • 3
  • 16
  • 30
  • 1
    Hi, add the msvcr100.dll file to the '${HADOOP_HOME}\bin' path. I faced the same problem. – ǨÅVËĔŊ RĀǞĴĄŅ Aug 01 '15 at 06:50
  • 2
    I think the answer at http://stackoverflow.com/a/23959201/411846 might help you here, it shows how you can check if there are some MSVC system libraries missing on your box. – centic Sep 04 '15 at 06:59
  • possible duplicate of [Running Apache Hadoop 2.1.0 on Windows](http://stackoverflow.com/questions/18630019/running-apache-hadoop-2-1-0-on-windows) – centic Sep 04 '15 at 06:59
  • Possible duplicate of [Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z](https://stackoverflow.com/questions/41851066/exception-in-thread-main-java-lang-unsatisfiedlinkerror-org-apache-hadoop-io) – 10465355 Nov 18 '18 at 19:41

3 Answers

4
  1. Get hadoop.dll (or libhadoop.so on *nix). Make sure its bitness (32- vs. 64-bit) matches your JVM.
  2. Make sure it is available via PATH or java.library.path.

    Note that setting java.library.path overrides PATH. If you set java.library.path, make sure it is correct and contains the hadoop library.
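
A small diagnostic sketch (not part of the original answer) that prints the three locations involved, so you can see where the missing hadoop.dll would have to be found:

```java
public class NativeLibCheck {
    public static void main(String[] args) {
        // On Windows the JVM derives the default java.library.path from PATH,
        // so hadoop.dll is found if its folder appears in either of these.
        System.out.println("PATH              = " + System.getenv("PATH"));
        // An explicit -Djava.library.path=... replaces that default entirely,
        // which is why it must then contain the Hadoop bin folder itself.
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        // Where Hadoop looks for winutils.exe (same effect as the HADOOP_HOME env var).
        System.out.println("hadoop.home.dir   = " + System.getProperty("hadoop.home.dir"));
    }
}
```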

rustyx
  • 80,671
  • 25
  • 200
  • 267
2

This error generally occurs due to a mismatch between the binary files in your %HADOOP_HOME%\bin folder and your Hadoop version.

Get hadoop.dll and winutils.exe built for your specific Hadoop version and copy them to your %HADOOP_HOME%\bin folder.
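
As a quick sanity check before submitting the job (a sketch; the fallback path `C:\hadoop-2.2.0` is a placeholder), you can verify from Java that both binaries are actually present in that folder:

```java
import java.io.File;

public class CheckHadoopBin {
    public static void main(String[] args) {
        // Resolve HADOOP_HOME from the environment; fall back to a placeholder path.
        String home = System.getenv("HADOOP_HOME");
        if (home == null) home = "C:\\hadoop-2.2.0";
        File bin = new File(home, "bin");
        for (String name : new String[] { "hadoop.dll", "winutils.exe" }) {
            File f = new File(bin, name);
            System.out.println(f.getAbsolutePath() + (f.isFile() ? "  -> found" : "  -> MISSING"));
        }
    }
}
```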

Vijay
  • 117
  • 1
  • 9
0

I had been having issues with my Windows 10 Hadoop installation since this morning: the NameNode and DataNode were not starting because of a mismatch in the binary files. The issue was resolved after I replaced the bin folder with one that corresponds to my Hadoop version. Possibly the bin folder that came with the installation was for a different version; I don't know how it happened. If all your configurations are intact, you might want to replace the bin folder with one that corresponds to your Hadoop installation.

K. SOT
  • 27
  • 6