
I have installed Hadoop and HDFS using this tutorial

http://codesfusion.blogspot.com/2013/10/setup-hadoop-2x-220-on-ubuntu.html

Everything is fine.

I am also able to create directories and use them using

hadoop fs -mkdir /tmp
hadoop fs -mkdir /small

I can also say

hadoop fs -ls /

However I am following a tutorial in which the trainer does

hadoop fs -mkdir temp
hadoop fs -ls

Now on my machine, when I issue the above command, it says

ls: `.': No such file or directory

In my training video the command hadoop fs -ls works perfectly. Why should I specify the "/"?

Also, I am getting this warning with all my commands:

13/12/28 20:23:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

But in my trainer's video there is no such warning.

My configuration file is exactly as the tutorial above and I can also see all management UIs at

http://abhishek-pc:8042/
http://abhishek-pc:50070/
http://abhishek-pc:8088/

So my question is what is wrong with my configuration and why is my system behaving differently than the training video?

Knows Not Much
  • have you formatted your namenode? – zhutoulala Dec 29 '13 at 02:56
  • Yes. As per the tutorial above I did `hdfs namenode -format`; after that I created my own directories, copied local files into HDFS, and everything is fine. But in my training video the trainer can use paths like tmp whereas I must use /tmp. The training video can also do hdfs://machine:10001/data/tmp but I must do /tmp. – Knows Not Much Dec 29 '13 at 03:00

7 Answers


Your ls: `.': No such file or directory error occurs because there is no home directory on HDFS for your current user. Try

hadoop fs -mkdir -p /user/[current login user]

Then you will be able to run hadoop fs -ls without an argument.
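Put together, the fix looks like this (a sketch; `$(whoami)` and the chown step are assumptions — substitute your actual login user, and run as the HDFS superuser if permissions require it):

```shell
# Create the HDFS home directory for the current OS user:
hadoop fs -mkdir -p /user/$(whoami)

# Optionally hand ownership of it to that user:
hadoop fs -chown $(whoami) /user/$(whoami)

# A bare listing now works, because relative paths (and ".")
# resolve against /user/<current user>:
hadoop fs -ls
```

These commands require a running HDFS cluster, so treat them as a command sketch rather than a copy-paste script.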

As for the warning WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable, please see my answer to this question.

zhutoulala

First, run:

hdfs dfs -mkdir /user

then:

hdfs dfs -mkdir /user/hduser

(Or in a single step: hdfs dfs -mkdir -p /user/hduser.)
icedtrees
user3364393

Solved this. Run hadoop fs -ls as the hdfs user (not as the root user): su - hdfs.

Gautam Pal
    Can you please expand on the commands you are using, what they mean, and what difference they make to the stated problem? Thanks – CurlyPaul May 29 '14 at 10:22

I faced a similar problem during the Hadoop tutorial from this link:

http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html

When I tried the command bin/hdfs dfs -put etc/hadoop input, it said

mkdir: `input': No such file or directory

The problem was solved by adding a leading / to input, so the command becomes:

bin/hdfs dfs -put etc/hadoop /input
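The leading / matters because HDFS resolves a relative path like input against the current user's HDFS home directory, /user/<user>, which does not exist on a fresh install. An alternative to the absolute path (a sketch; assumes you would rather keep the files under your HDFS home directory):

```shell
# Create the HDFS home directory once; relative paths then work:
bin/hdfs dfs -mkdir -p /user/$(whoami)
bin/hdfs dfs -put etc/hadoop input   # lands in /user/<user>/input
```

As with the commands above, this assumes a running HDFS cluster.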
naveen dahiya

This could also happen due to bad carriage-return characters. Run dos2unix on the hdfs executable (a shell script) and, if required, on all other related shell scripts as well.
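For example (a sketch; $HADOOP_HOME and the file list are assumptions — adjust them to your install):

```shell
# Strip CRLF line endings from the main launcher scripts:
dos2unix "$HADOOP_HOME/bin/hdfs" "$HADOOP_HOME/bin/hadoop"

# Or convert every script under bin/ and sbin/ in one pass:
find "$HADOOP_HOME/bin" "$HADOOP_HOME/sbin" -type f -exec dos2unix {} +
```

This requires the dos2unix utility to be installed; treat it as a command sketch.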

Pushkar

When you want to put something into HDFS for the first time, follow these steps:

  1. hdfs dfs -mkdir -p /user/<username> (the name of the user)
  2. hdfs dfs -put ~/file

(Note the command is hdfs dfs, not hdfs fs.)
majdouline

After hdfs dfs -mkdir /user/[user name], do:

hadoop fs -ls /

It works for me!

dTatlvi