
I'm using the Hadoop MapReduce paradigm, and I need to get the NameNode's IP address from a DataNode. Can anyone give me an idea of how to do this?

Thanks.

mohamus

2 Answers


The easiest way is to open the core-site.xml file under the HADOOP_HOME/conf directory. The value of the **fs.default.name** property tells you the host and port where the NameNode is running.
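As a quick sanity check you can also parse core-site.xml yourself and pull the host and port out of fs.default.name. A minimal sketch (the sample file, hostname, and port below are illustrative, not your real config):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

def namenode_address(core_site_path):
    """Return (host, port) parsed from fs.default.name in core-site.xml."""
    root = ET.parse(core_site_path).getroot()
    for prop in root.iter("property"):
        if prop.findtext("name") == "fs.default.name":
            # Value looks like hdfs://<host>:<port>
            uri = urlparse(prop.findtext("value"))
            return uri.hostname, uri.port
    raise KeyError("fs.default.name not set in " + core_site_path)

# Throwaway sample config for demonstration (hostname/port are made up):
with open("core-site-sample.xml", "w") as f:
    f.write("""<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode:54310</value>
  </property>
</configuration>""")

print(namenode_address("core-site-sample.xml"))  # -> ('namenode', 54310)
```

If this prints localhost, the NameNode URI in your config is the problem, not the lookup.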

Tariq
  • The **fs.default.name** property gives localhost on the DataNode, Tariq. We can use this only on the NameNode. – mohamus Jun 04 '14 at 19:26
  • That means you are running in pseudo distributed mode(All processes on a single machine). In this case the NN machine is same as the DN machine. – Tariq Jun 04 '14 at 19:28
  • No, I used it on a multi-node Hadoop cluster, and when I try to get the IP address from the DataNode it gives me 127.0.0.1/localhost. – mohamus Jun 04 '14 at 19:36
  • Could you please show me your **configuration files**, along with the **slaves** file present inside HADOOP_HOME/conf directory? – Tariq Jun 04 '14 at 19:37
  • **fs.default.name** is set to hdfs://localhost:54310 – mohamus Jun 04 '14 at 19:44
  • The **slaves** file contains: localhost slave slave2 slave3 slave4 – mohamus Jun 04 '14 at 19:45
  • Although your slaves file contains multiple hostnames, I doubt it is actually a fully distributed setup. Could you please point your web browser to the NN web UI (NN_machine:50070) and make sure of this? – Tariq Jun 04 '14 at 19:51
  • localhost:50070 and the datanode – mohamus Jun 04 '14 at 21:00

Delete the line 127.0.0.1 localhost in your /etc/hosts file and add the IP address and hostname of every machine in the cluster. If you leave the file at its defaults, Hadoop resolves all the cluster machines' IPs and names as 127.0.0.1 localhost.
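For example, with one master and the four slaves from your slaves file, /etc/hosts on every node might look like this (the IP addresses and the master hostname are illustrative, substitute your own):

```
192.168.1.10  master
192.168.1.11  slave
192.168.1.12  slave2
192.168.1.13  slave3
192.168.1.14  slave4
```

Then point fs.default.name at the master's hostname (e.g. hdfs://master:54310) instead of localhost, so DataNodes resolve the NameNode's real address.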

Junayy