
I am getting the UnsatisfiedLinkError shown below (org.apache.hadoop.io.nativeio.NativeIO$Windows.access0) during my install of Hadoop 3.2.1.

I was told to follow this guide: https://towardsdatascience.com/installing-hadoop-3-2-1-single-node-cluster-on-windows-10-ac258dd48aef and everything works until Step 6, when I try to run .\start-dfs.cmd.

I have found several suggested solutions, but none of them worked.

I added all the native files from the GitHub site (hadoop.dll, etc.) to the bin folder, and I also copied hadoop.dll to the System32 folder.
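
To double-check that the JVM can actually see the DLL, I used a small diagnostic class of my own (NativeLoadCheck is my name, not a Hadoop class; it only checks that the library can be loaded, not that it matches the Hadoop version):

public class NativeLoadCheck {
    public static void main(String[] args) {
        // Show where the JVM looks for native libraries and whether HADOOP_HOME is set
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
        System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
        try {
            // "hadoop" resolves to hadoop.dll on Windows
            System.loadLibrary("hadoop");
            System.out.println("hadoop.dll loaded successfully");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("hadoop.dll could NOT be loaded: " + e.getMessage());
        }
    }
}

If hadoop.dll cannot be loaded here, the problem is the library path; if it loads but start-dfs.cmd still fails with the UnsatisfiedLinkError, the DLL probably does not match the Hadoop 3.2.1 binaries.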

I also tried to edit the source code in IntelliJ (adding a package org.apache.hadoop.io.nativeio and creating a NativeIO.java class in it), following the guide quoted below; a sketch of the patched method is shown after the quote:

  1. Obtain and patch Hadoop. Hadoop is compatible with JDK 7 and 8. You may check if you have one (or more) of them installed. The binary of Hadoop can be found at http://hadoop.apache.org/releases.html. Extract all files to a safe place, for example C:\hadoop-2.7.3. Then download the patch files from https://github.com/srccodes/hadoop-common-2.2.0-bin/archive/master.zip and extract (and replace) all files into the bin folder of your Hadoop binary. In this example, the full path will be C:\hadoop-2.7.3\bin.

  2. Developing with IntelliJ IDEA. IntelliJ IDEA is a powerful IDE for Java development. You may already have it installed on your working computer. If not, grab it from https://www.jetbrains.com/idea/. The “Community” edition is free for non-commercial use. A nice step-by-step guide for this part can be found at https://mrchief2015.wordpress.com/2015/02/09/compiling-and-debugging-hadoop-applications-with-intellij-idea-for-windows/.

  3. Troubleshooting: 3.1. You may get an exception when debugging that says

Exception in thread "main" java.io.IOException: (null) entry in command string: null chmod 0700 ...
 at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:769)

Solution: You need to set an HADOOP_HOME environment variable. You can either add it system-wide, or only for IntelliJ as shown in Setting up and using environment variables in IntelliJ IDEA. The name of the environment variable should be HADOOP_HOME, and its value is the folder path of your Hadoop binary (in this example, C:\hadoop-2.7.3).

  3.2. You may get another exception that says

Exception in thread "main" java.lang.UnsatisfiedLinkError:
org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
 at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)

To solve this issue, you need to download the source package of Hadoop from the link given in Sec. 1, extract it, and copy hadoop-2.7.3-src.tar.gz\hadoop-2.7.3-src\hadoop-common-project\hadoop-common\src\main\java\org\apache\hadoop\io\nativeio\NativeIO.java into your project, then modify line 609 (in the access function) from

return access0(path, desiredAccess.accessRight());

to

return true; 
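
In my copy of NativeIO.java, the patched part ends up looking roughly like this (only the edited access method inside the nested Windows class is shown as a sketch; every other line of the file is copied unchanged from the Hadoop source, so this fragment does not compile on its own):

// Inside the copied org.apache.hadoop.io.nativeio.NativeIO, nested class Windows.
// Only the body of access() is changed; the rest of the file stays as in the Hadoop source.
public static boolean access(String path, AccessRight desiredAccess)
    throws IOException {
  // original: return access0(path, desiredAccess.accessRight());
  return true; // bypass the native access0 call that throws the UnsatisfiedLinkError
}

As noted above, this did not fix the error for me either.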

What else can I try to fix this?

I can't tell from your post what's actually your question, but for the link error, I suggest you use a Linux VM or WSL2 since Hadoop doesn't *run* well with Windows anyway, despite being able to initially *install it*. Plus, using 2.7.3 code with Hadoop 3 won't work – OneCricketeer Feb 14 '21 at 15:58
