
I am following Udacity's course on Hadoop, which instructs using the command hadoop fs -ls to list files. But on my machine running Ubuntu, it instead lists the files in the present working directory. What am I doing wrong?

The which hadoop command gives the output: /home/usrname/hadoop-2.5.1//hadoop

Are the double slashes in the path the cause of this problem?

  • Could your local files just be exactly the same as the ones in HDFS? – SMA Dec 29 '14 at 07:08
  • @almasshaikh: I ruled out that possibility by trying the command from different directories. – dev Dec 29 '14 at 07:09
  • Paste the output of the following commands: ls; hadoop fs -ls; alias hadoop; which hadoop – SMA Dec 29 '14 at 07:11
  • `ls` - lists files in the current directory; `hadoop fs -ls` - same as ls; `alias hadoop` - bash: alias: hadoop: not found; `which hadoop` - /home/usrname/hadoop-2.5.1/bin//hadoop (I suspect the double slashes in the path might be a problem?) – dev Dec 29 '14 at 07:18

4 Answers


Your file system must be pointing to the local file system. Just modify the configuration to point it to HDFS and restart the processes.

Check this configuration:

<property>
    <name>fs.default.name</name>
    <value>hdfs://<IP>:<Port></value>
</property>
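
In a typical Hadoop 2.x install, core-site.xml lives under $HADOOP_HOME/etc/hadoop. Once it points at HDFS, one way to double-check which filesystem the client actually resolves is the sketch below (assuming the hdfs command from a 2.x install is on your PATH):

# Print the filesystem URI the client is configured with; file:/// means it is
# still falling back to the local-filesystem default.
hdfs getconf -confKey fs.defaultFS

# List the HDFS root via an explicit URI to confirm the namenode is reachable.
hadoop fs -ls hdfs://<IP>:<Port>/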
Ashish

You have to set up the path to the Hadoop root folder in your current user's .bashrc file, something like:

export HADOOP_HOME=/home/seo/hadoop/hadoop-1.2.1

Then add it to your system PATH variable:

export PATH=$PATH:$HADOOP_HOME/bin

Then, when you run

hadoop fs -ls

it will list your HDFS files, provided your Hadoop cluster is up and running.
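
Putting the two lines together, a minimal ~/.bashrc sketch (the HADOOP_HOME value is this answer's example path; the question's install would use /home/usrname/hadoop-2.5.1 instead):

# ~/.bashrc: point HADOOP_HOME at the Hadoop install root and put its bin/ on PATH
export HADOOP_HOME=/home/seo/hadoop/hadoop-1.2.1
export PATH=$PATH:$HADOOP_HOME/bin

# Reload the file in the current shell and confirm the binary now resolves from there
source ~/.bashrc
which hadoop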

Rajen Raiyarela
  • Not getting any error. It just lists the files in the current working directory instead of listing the files in HDFS. – dev Dec 29 '14 at 13:39

It's likely that your client is not picking up the correct Hadoop configuration files, which is why it defaults to your local filesystem.

Set HADOOP_CONF_DIR to the directory of the hadoop configuration files. Also verify that fs.defaultFS is specified correctly in core-site.xml.
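
For example, a sketch (the path assumes the Hadoop 2.x layout where the configuration files sit under $HADOOP_HOME/etc/hadoop):

# Tell the client where core-site.xml, hdfs-site.xml, etc. live
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

# Verify fs.defaultFS is an hdfs:// URI rather than the file:/// default
grep -A1 "fs.defaultFS" "$HADOOP_CONF_DIR/core-site.xml"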


Can you please try running the command below? Do so after checking that the configuration suggested by Ashish is present in your core-site.xml.

hadoop dfs -ls hdfs://IP:PORT/

Thanks Arani
