
I am new to Hadoop. Is there a bash command to transfer files from the Hadoop Distributed File System (HDFS) to the standard local file system on a Hadoop node?

I am using Hadoop 2.6.0.

I saw a similar question that asks how to do the same in Java: Copying files from HDFS to local file system with JAVA

Can we do it with a simple shell command instead (one that runs on a node that is part of the Hadoop cluster)?

Pranjal Mittal
  • possible duplicate of [How to copy file from HDFS to the local file system](http://stackoverflow.com/questions/17837871/how-to-copy-file-from-hdfs-to-the-local-file-system) – Remus Rusanu Mar 03 '15 at 07:30
  • Hmm, this looks like a possible duplicate, but I just checked that the commands in the answers there do not work. Looks like hdfs command works for `hadoop 2.6.0` and `bin/hadoop fs` is deprecated. – Pranjal Mittal Mar 03 '15 at 07:36

2 Answers


hdfs dfs -get /hdfs/path /local/path

hdfs dfs -put /local/path /hdfs/path
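
For example, assuming a file at /user/hadoop/input.csv in HDFS and a local directory /tmp/data (both hypothetical paths), the copy in each direction would look like this:

hdfs dfs -get /user/hadoop/input.csv /tmp/data/input.csv

hdfs dfs -put /tmp/data/results.csv /user/hadoop/results.csv

Both commands work from any node where the Hadoop client is configured, which covers the "runs on a node that is part of the cluster" requirement in the question.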

Pradeep Gollakota

If you want to pull data down from HDFS to a local directory, you'll need to use the -get or -copyToLocal switch of the hadoop fs command.

hadoop fs -copyToLocal hdfs://path localpath

Just call the command from a shell script; you can do something like the loop below.

# read the second column of each .csv entry from the list file
for line in $(awk '/.csv/ {print $2}' /user/hadoop/TempFiles/CLNewFiles.txt)
do
    # copy each listed file from HDFS to the local path
    hadoop fs -copyToLocal /user/hadoop/TempFiles/"$line" yourlocalpath
    echo "$line file is downloading from hadoop"
done
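
If the list file CLNewFiles.txt itself lives in HDFS rather than on the local disk (the path above could be read either way), a minimal sketch of the same loop, assuming the same list format, streams it out with hdfs dfs -cat and pipes it into the loop:

# stream the list file out of HDFS, pick the .csv entries, and copy each one locally
hdfs dfs -cat /user/hadoop/TempFiles/CLNewFiles.txt | awk '/.csv/ {print $2}' | while read -r line
do
    hadoop fs -copyToLocal /user/hadoop/TempFiles/"$line" yourlocalpath
    echo "$line file is downloading from hadoop"
done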

Sravan K Reddy