I installed Hadoop on my laptop and all of the services are running except the DataNode. Initially the NameNode and Secondary NameNode were not running either; after I changed some permissions on their directories, they came up fine.

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ jps
14339 NameNode
16579 Jps
15571 NodeManager
15076 SecondaryNameNode
15231 ResourceManager

Here is my hdfs-site.xml:

<configuration>
 <property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication.
  The actual number of replications can be specified when the file is created.
  The default is used if replication is not specified in create time.
  </description>
 </property>
 <property>
   <name>dfs.namenode.name.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
 </property>
 <property>
   <name>dfs.datanode.data.dir</name>
   <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
 </property>
</configuration>
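
Both storage directories were created ahead of time. A minimal sketch of how they were set up, from memory (the hduser:hadoop owner/group and the 777 mode match the listings below):

sudo mkdir -p /usr/local/hadoop_store/hdfs/namenode
sudo mkdir -p /usr/local/hadoop_store/hdfs/datanode
# give the Hadoop user ownership of the whole store
sudo chown -R hduser:hadoop /usr/local/hadoop_store
# open up permissions, as shown in the listings below
sudo chmod -R 777 /usr/local/hadoop_store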

Permissions for both directories:

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ ls -ld /usr/local/hadoop_store/hdfs/namenode
drwxrwxrwx 3 hduser hadoop 4096 Nov 20 13:51 /usr/local/hadoop_store/hdfs/namenode

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ ls -ld /usr/local/hadoop_store/hdfs/datanode
drwxrwxrwx 2 hduser hadoop 4096 Nov 17 14:10 /usr/local/hadoop_store/hdfs/datanode

And the log file for the DataNode:

hduse@Lenovo-IdeaPad-S510p:/usr/local/hadoop/sbin$ less /usr/local/hadoop/logs/hadoop-hduse-datanode-Lenovo-IdeaPad-S510p.log

......./*some data truncated*/......
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1; compiled by 'jenkins' on 2014-11-13T21:10Z
STARTUP_MSG:   java = 1.8.0_66
************************************************************/
2015-11-20 13:51:42,778 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2015-11-20 13:51:43,305 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /usr/local/hadoop_store/hdfs/datanode : 
EPERM: Operation not permitted
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:652)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
        at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
        at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2239)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2281)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2402)
2015-11-20 13:51:43,307 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/usr/local/hadoop_store/hdfs/datanode/" 
        at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2290)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2263)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2155)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2202)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2378)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2402)
2015-11-20 13:51:43,309 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2015-11-20 13:51:43,311 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at Lenovo-IdeaPad-S510p/127.0.1.1
************************************************************/

The DataNode data directory has full (777) permissions. I tried clearing the tmp directory and formatting the NameNode, but the DataNode still will not start.

Please advise what changes I should make to get the DataNode running.

Wanderer

1 Answer

  1. First delete all contents from the temporary folder: rm -rf <hadoop_tmp_dir> (mine was /usr/local/hadoop/hadoop_tmp).
  2. Format the NameNode: bin/hadoop namenode -format
  3. Start all processes again: bin/start-all.sh (see the consolidated sketch below).
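
Put together, the sequence looks roughly like this. This is a sketch, not verbatim: the tmp path is the one from my setup, and on Hadoop 2.x the format command is bin/hdfs namenode -format and the start scripts live under sbin/ rather than bin/.

cd /usr/local/hadoop
# 1. clear the Hadoop tmp directory (adjust the path to your hadoop.tmp.dir)
rm -rf /usr/local/hadoop/hadoop_tmp/*
# 2. format the NameNode -- note this erases all existing HDFS metadata
bin/hdfs namenode -format
# 3. start the HDFS and YARN daemons again
sbin/start-dfs.sh
sbin/start-yarn.sh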

When starting Hadoop for the first time you need to format the NameNode; that is most likely what is causing the problem.

Pramod Patil