3

I'm trying to install Hadoop 2.2.0 as a single-node cluster on my computer using this tutorial: http://codesfusion.blogspot.gr/2013/10/setup-hadoop-2x-220-on-ubuntu.html?m=1 . I follow every instruction step by step, but I run into the same problem every time: the NameNode, DataNode, and SecondaryNameNode are not running. This is what I see when I enter start-dfs.sh, start-yarn.sh, and jps:

hduser@victor-OEM:/usr/local/hadoop/sbin$ start-dfs.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-victor-OEM.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-victor-OEM.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is 62:ec:99:e3:ce:2d:f8:79:1f:f8:9a:2a:25:9d:17:95.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-victor-OEM.out
hduser@victor-OEM:/usr/local/hadoop/sbin$ start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-victor-OEM.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-victor-OEM.out
hduser@victor-OEM:/usr/local/hadoop/sbin$ jps
10684 NodeManager
10745 Jps
10455 ResourceManager
IrishDog

5 Answers

2

Certain versions of the codesfusion tutorial (such as this one) omit the XML tags within code blocks, so that:

#add this to foo.txt   
<bizz>bar</bizz>

became:

#add this to foo.txt
bar

Including the XML tags in the configuration files resolved the issue.
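
For reference, a minimal sketch of the properly tagged core-site.xml entry, assuming the tutorial's single-node default address (hdfs://localhost:9000):

<!-- core-site.xml: the <property>, <name>, and <value> tags must survive the
     copy-paste, or start-dfs.sh cannot resolve the namenode address -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

With the tags stripped, Hadoop never sees the filesystem address, which is exactly the "namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured" message in the question.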

Ocasta Eshu
2

You can try this link: Learning Hadoop. It is for 0.23.9, but it also works for 2.2.0.

Rushikesh Garadade
1

Disable IPv6 in hadoop-env.sh:

export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true
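
A sketch of applying this to the layout from the question (it assumes Hadoop is installed under /usr/local/hadoop, as it is there):

# Append the IPv4-only setting to hadoop-env.sh, then restart HDFS so it takes effect.
# The path assumes the install location from the question.
echo 'export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true"' >> /usr/local/hadoop/etc/hadoop/hadoop-env.sh
/usr/local/hadoop/sbin/stop-dfs.sh
/usr/local/hadoop/sbin/start-dfs.sh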
Aaaaaaaa
0

I had the same problem.

I solved it by disabling the firewall.

Just use this command:

sudo ufw disable
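
Before disabling it permanently, it may be worth checking whether the firewall is active at all (standard ufw usage):

# If ufw reports "inactive", the firewall is not the cause and something else is wrong.
sudo ufw status verbose

Note that ufw disable persists across reboots, so consider re-enabling the firewall (sudo ufw enable) once the Hadoop ports are open.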
M. Dhaouadi
-5

I tried the steps below:

  1. ssh-keygen -t rsa -P ""

  2. cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys

After that, open a new terminal and start the Hadoop cluster; that solved my problem.
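
Those two commands create a passwordless RSA key and authorize it for localhost. A quick way to verify it worked before restarting the daemons (the chmod is only needed if sshd rejects the file because its permissions are too open):

# sshd ignores an authorized_keys file that is group- or world-writable
chmod 600 $HOME/.ssh/authorized_keys
# this should now log in without prompting for a password
ssh localhost 'echo passwordless ssh works'
/usr/local/hadoop/sbin/start-dfs.sh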

Nagarjuna D N