
I am trying to install Hadoop on Windows 10.

Reference: https://github.com/MuhammadBilalYar/Hadoop-On-Window/wiki/Step-by-step-Hadoop-2.8.0-installation-on-Window-10

Hadoop's start-all.cmd command starts the namenode, resourceManager and nodeManager successfully, but the datanode does not start.

Error:

checker.StorageLocationChecker: Exception checking StorageLocation [DISK]file:/C:/hadoop-3.1.1/data/datanode
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Ljava/lang/String;)Lorg/apache/hadoop/io/nativeio/NativeIO$POSIX$Stat;
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.stat(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.getStat(NativeIO.java:455)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfoByNativeIO(RawLocalFileSystem.java:796)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:710)
        at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:678)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:233)
        at org.apache.hadoop.util.DiskChecker.checkDirInternal(DiskChecker.java:141)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:116)
        at org.apache.hadoop.hdfs.server.datanode.StorageLocation.check(StorageLocation.java:239)
        at org.apache.hadoop.hdfs.server.datanode.StorageLocation.check(StorageLocation.java:52)
        at org.apache.hadoop.hdfs.server.datanode.checker.ThrottledAsyncChecker$1.call(ThrottledAsyncChecker.java:142)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
2018-12-28 11:19:03,023 ERROR datanode.DataNode: Exception in secureMain
org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
        at org.apache.hadoop.hdfs.server.datanode.checker.StorageLocationChecker.check(StorageLocationChecker.java:220)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2762)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2677)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2719)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2863)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2887)
2018-12-28 11:19:03,031 INFO util.ExitUtil: Exiting with status 1: org.apache.hadoop.util.DiskChecker$DiskErrorException: Too many failed volumes - current valid volumes: 0, volumes configured: 1, volumes failed: 1, volume failures tolerated: 0
2018-12-28 11:19:03,079 INFO datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at BHARTI/192.168.2.161
************************************************************/
  • Note: The blog you are reading uses Hadoop 2.8, and you are using Hadoop 3.1.1, which has differences in how services are configured and run – OneCricketeer Dec 28 '18 at 18:12
  • @cricket_007 I changed my Hadoop version, but I still got the same error – Bharti Ladumor Jan 07 '19 at 08:20
  • Well, I mean "failed volume" sounds like a hardware problem, which isn't really a thing Hadoop or software itself can solve – OneCricketeer Jan 08 '19 at 18:28
  • Yes, either the configured volume simply doesn't exist at the moment, or there is a hardware problem with the volume. It's currently configured as `/C:/hadoop-3.1.1/data/datanode`; have you installed Windows on another drive (i.e. `D:` or `E:`)? – David Jan 13 '19 at 10:16
  • @David, no, Windows is installed on the same drive, that is the C: drive – Bharti Ladumor Jan 15 '19 at 03:52
  • I tried installing Hadoop 2.8.0, following https://github.com/MuhammadBilalYar/Hadoop-On-Window/wiki/Step-by-step-Hadoop-2.8.0-installation-on-Window-10, and my Hadoop is working absolutely fine – Bharti Ladumor Jan 15 '19 at 04:53

3 Answers


I successfully installed Hadoop 2.8.0 from

reference: https://github.com/MuhammadBilalYar/Hadoop-On-Window/wiki/Step-by-step-Hadoop-2.8.0-installation-on-Window-10

Bharti Ladumor

I installed Hadoop 2.8.0

reference: Hadoop on Windows

Be careful while changing the XML files or while copying them from the site; changes are still needed to the files under etc beyond what the site shows.

I was getting the same error. Then I saw that I had not provided correct path values for the datanode and namenode in the hdfs-site.xml file; after correcting them it is working fine.
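
For anyone hitting the same problem, a minimal sketch of what corrected namenode/datanode paths in hdfs-site.xml might look like, assuming Hadoop 2.8.0 is extracted to C:\hadoop-2.8.0 and the data folders have been created there (the exact paths are assumptions; use your own install directory):

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.namenode.name.dir</name>
            <!-- assumed location; must be a directory that exists on your drive -->
            <value>file:///C:/hadoop-2.8.0/data/namenode</value>
        </property>
        <property>
            <name>dfs.datanode.data.dir</name>
            <!-- assumed location; must be a directory that exists on your drive -->
            <value>file:///C:/hadoop-2.8.0/data/datanode</value>
        </property>
    </configuration>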


I am posting what worked for me. In etc/hadoop/core-site.xml, keep the following configuration:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
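
As a quick sanity check that this setting is picked up, one can query it with the standard HDFS CLI (a sketch; run it from the Hadoop bin directory or with bin on your PATH):

    REM Should print hdfs://localhost:9000 if core-site.xml is being read
    hdfs getconf -confKey fs.defaultFS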

In etc/hadoop/hdfs-site.xml, keep the following configuration:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <!-- Your path to the namenode may be different -->
        <value>file:///C:/hadoop-3.2.2/data/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>datanode</value>
    </property>
</configuration>

Now run the command hdfs namenode -format in the bin directory, and then in the sbin directory run start-dfs.cmd. The datanode should run now if there is no error.
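
A minimal sketch of that sequence from a Windows command prompt, assuming Hadoop is installed at C:\hadoop-3.2.2 (the install path is an assumption; substitute your own):

    REM Format the namenode (run once; this erases existing HDFS metadata)
    cd C:\hadoop-3.2.2\bin
    hdfs namenode -format

    REM Start the HDFS daemons (namenode and datanode)
    cd C:\hadoop-3.2.2\sbin
    start-dfs.cmd

    REM Check that NameNode and DataNode processes are running
    jps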

If you can't upload files, you need to change the permissions. For example, let's say you created a directory called user with the command hdfs dfs -mkdir /user. By default its permission will be 'drwxr-xr-x'. You need to change that with the command hdfs dfs -chmod 777 /user, after which the permission will be 'drwxrwxrwx'. Now you should be able to upload and download files.
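
A short sketch of that sequence (the local file paths are made up for illustration):

    REM Create a directory in HDFS and inspect its default permissions
    hdfs dfs -mkdir /user
    hdfs dfs -ls /

    REM Open up the permissions so uploads and downloads are allowed
    hdfs dfs -chmod 777 /user

    REM Upload and download a file to confirm access
    hdfs dfs -put C:\tmp\sample.txt /user/
    hdfs dfs -get /user/sample.txt C:\tmp\sample_copy.txt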

Nithish D