
I am using Hadoop 2.2 on a single-node Ubuntu cluster. I started the cluster with start-all.sh. When I try to load a text file into HDFS, it throws the following error:

hduser@ubuntu:~$ hadoop dfs -put /home/aditya/Desktop/data.txt
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

13/11/26 00:40:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: Call From ubuntu/127.0.1.1 to localhost:54310 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused

These are my /etc/hosts file details:

127.0.0.1       localhost
127.0.1.1       ubuntu

# The following lines are desirable for IPv6 capable hosts
::1     localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
ff02::3 ip6-allhosts

I searched and tried to fix this error, but without success. Please help me with your ideas. Thank you.

Aditya
  • Just FYI, the native-hadoop library message is just a warning and unrelated to the actual error you have, which is connection refused. – Mike Park Nov 25 '13 at 23:03
  • Hey, thanks Climbage. Maybe the error is as you said. I have also searched for this error and tried to solve it, but didn't get any good results. The other thing I found concerns 32-bit vs. 64-bit OS versions: the Hadoop native library files are compiled for 32-bit, and I am using 64-bit Ubuntu. Could that be the cause? – Aditya Nov 26 '13 at 02:27
  • Probably depends on whether you're using 32- or 64-bit Java – Mike Park Nov 26 '13 at 02:50
  • Yes. I am using 64 bit JVM. – Aditya Nov 26 '13 at 05:19

1 Answer


What version of Hadoop are you using? How many nodes do you have in the cluster? The error you're seeing usually results from /etc/hosts settings. Make sure all boxes can ping each other by name. In our small 2-node cluster (Hadoop 2.2.0), I removed all hostname-to-127.0.1.1 mappings and bound each hostname to its real IP.
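As a sketch of the /etc/hosts layout that approach gives (the 192.168.1.9 address below is a placeholder; substitute your machine's actual IP):

127.0.0.1       localhost
# 127.0.1.1     ubuntu          (hostname-to-127.0.1.1 mapping removed)
192.168.1.9     ubuntu

With that in place, "ping ubuntu" resolves to the real interface rather than the loopback alias, which is what Hadoop's daemons need when they bind and connect by hostname.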


Please take a look at this Stack Overflow question for /etc/hosts settings: Hadoop (local and host destination do not match) after installing hive

I strongly recommend looking at the Hadoop 2 setup docs linked in the comments below, since several things have changed.

Vishal
  • Hi Vishal. I am using Hadoop 2.2. I installed Hadoop on Ubuntu (single-node cluster) with the help of this site: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/. I have added my /etc/hosts file details to the question, please check. – Aditya Nov 26 '13 at 02:19
  • That link doesn't have the latest information, and several things have changed for Hadoop/YARN 2.x.x. The following links have more data: http://hortonworks.com/wp-content/uploads/downloads/2013/06/Apache.Hadoop.YARN_.Sample.pdf and http://raseshmori.wordpress.com/2012/09/23/install-hadoop-2-0-1-yarn-nextgen/ – Vishal Nov 26 '13 at 04:08
  • Also, can you please try my suggestion of commenting out the ubuntu-to-127.0.1.1 mapping and adding your IP instead? – Vishal Nov 26 '13 at 04:11
  • Thank you very much Vishal. Definitely I will go through the links provided by you (probably in the evening). – Aditya Nov 26 '13 at 05:18
  • Yes Vishal, it worked. The actual problem was in my /etc/hostname file: the hostname was ubuntu instead of localhost. Thanks for your links as well; they helped me during installation. – Aditya Nov 27 '13 at 04:12
  • Vishal, in which configuration file should I put $JAVA_HOME? When running sbin/stop-all.sh, it throws a "JAVA_HOME not set" error, even though it is set in my bash file. Thank you. – Aditya Nov 27 '13 at 04:15
  • I've added JAVA_HOME (/usr/local/java in my case) in 2 spots: .bashrc for my bash shell and the hadoop-env.sh file. I'm away from my setup and can send the exact data tomorrow. Also, I start and stop all my services independently (4 calls, for the namenode, datanode, resourcemanager and nodemanager) to make sure things are ok. Check the logs under hadoop/logs/*.log and the monitoring portals during startup. – Vishal Nov 27 '13 at 06:23
  • In hadoop-env.sh: export JAVA_HOME=/usr/local/jdk1.7.0_45 (under "# The java implementation to use."). In .bashrc: export JAVA_HOME=/usr/local/jdk1.7.0_45 – Vishal Nov 27 '13 at 19:02
  • Thank you Vishal. The links and the PDF you provided are very helpful. I would really appreciate it if you could share more links for learning Hadoop, if you have any. Thanks a lot again. – Aditya Nov 28 '13 at 16:57
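For reference, the JAVA_HOME settings described in the comments above, as they would appear in the two files (the JDK path is the one quoted in the comments; substitute your own install location):

# In hadoop-env.sh (under etc/hadoop/ of the Hadoop install):
# The java implementation to use.
export JAVA_HOME=/usr/local/jdk1.7.0_45

# In ~/.bashrc:
export JAVA_HOME=/usr/local/jdk1.7.0_45

Setting it in hadoop-env.sh matters because the start/stop scripts may run in a shell that has not sourced your .bashrc, which is why JAVA_HOME can appear unset to sbin/stop-all.sh even when it works in your interactive shell.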