
The problem I am facing is that when I run the command "hadoop fs -ls", it returns this message: "ls: `.': No such file or directory".

For reference, the output of my "jps" command is:

18276 SecondaryNameNode
19684 Jps
17942 NameNode
18566 NodeManager
18441 ResourceManager
  • Possible duplicate of [Hadoop 2.2 Installation \`.' no such file or directory](http://stackoverflow.com/questions/20821584/hadoop-2-2-installation-no-such-file-or-directory) – OneCricketeer Mar 26 '17 at 14:45
  • Or http://stackoverflow.com/questions/28241251/hadoop-fs-ls-results-in-no-such-file-or-directory/28260263#28260263 – OneCricketeer Mar 26 '17 at 14:48
  • @cricket_007 yes I tried it... and yes the Duplicate link one that you posted solved my issue... Thanks all... – Gaurav A Dubey Mar 27 '17 at 15:58

2 Answers


First, you should have a DataNode running: it is the daemon that stores the actual data, and without it you will not be able to do much with the Hadoop file system (hadoop fs).

Try starting all the services:

$start-all.sh
$jps

Ensure that the DataNode is running and that nothing is blocking it.

Then try

$hadoop fs -ls /
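Note that the jps output in the question lists NameNode, SecondaryNameNode, ResourceManager and NodeManager but no DataNode, which is exactly the condition this answer is about. A minimal sketch of that check, run here against the question's captured output (on a live system you would pipe jps directly):

```shell
# The jps listing copied from the question - note the absent DataNode line
jps_output='18276 SecondaryNameNode
19684 Jps
17942 NameNode
18566 NodeManager
18441 ResourceManager'

# On a live system: jps | grep -q 'DataNode$'
if ! printf '%s\n' "$jps_output" | grep -q 'DataNode$'; then
  echo "DataNode is not running"
fi
```

The `DataNode$` anchor avoids a false match on SecondaryNameNode, whose name also ends in "NameNode" but not in "DataNode".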
Mosab Shaheen
  • Yep. There was the directory missing... I tried above step and it didn't resolve the error... – Gaurav A Dubey Mar 27 '17 at 15:56
  • @GauravADubey Did you ensure that the data node is running and that you formatted the namenode: hadoop namenode -format – Mosab Shaheen Mar 28 '17 at 09:28
  • @MosabShaheen Yes I did that. And it completed everything smoothly. The step you mentioned worked once I created the directory using hadoop fs -mkdir -p /user/mysystemname – Gaurav A Dubey Mar 29 '17 at 17:10
  • @GauravADubey Great. When I setup hadoop there was already a folder under /user/ holding the user name. But it is not mandatory to use that folder you can work on the root directory for testing that's why I wrote: $hadoop fs -ls / – Mosab Shaheen Mar 29 '17 at 18:28

When you don't pass any argument to the hadoop fs -ls command, the default HDFS directory it tries to list is /user/{your_user_name}.

The problem in your case could be that this hdfs directory does not exist.

Try running hadoop fs -ls /user/ to see which directories are created for which users.

You can also just create your user's hdfs default directory. Running the below command will fix your error:

hadoop fs -mkdir -p /user/$(whoami)
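A sketch of how that default path is derived, plus the superuser variant that OneCricketeer's comment refers to; the hadoop commands themselves are shown only as reference comments since they need a live cluster, and the superuser account name hdfs is an assumption that varies by installation:

```shell
# Derive the HDFS home directory that a bare 'hadoop fs -ls' resolves to
target_user="$(whoami)"
home_dir="/user/${target_user}"
echo "expected HDFS home: ${home_dir}"

# On a multi-user install these usually have to be run as the HDFS
# superuser (often 'hdfs'), then handed over to the target user:
#   sudo -u hdfs hadoop fs -mkdir -p "${home_dir}"
#   sudo -u hdfs hadoop fs -chown "${target_user}" "${home_dir}"
```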

PetrosP
  • This would need to be run as the `hdfs` or `hadoop` user, not the currently logged-in user, depending on the installation process – OneCricketeer Mar 26 '17 at 14:47
  • Hi.. Yes the problem was that there was no directory for hadoop and so created it using -mkdir statement. After that things started working... – Gaurav A Dubey Mar 27 '17 at 15:55
  • Although I think stop-all.sh, or stopping HDFS and YARN individually, was also needed, otherwise it was throwing some issue. – Gaurav A Dubey Mar 27 '17 at 15:55