My cluster is Hortonworks HDP 2.2, managed by Ambari (ambari-server-2.0.1-45).

I want to back up an HDFS folder to AWS S3. I used the following command:

    hadoop distcp hdfs://internalip:8020/backup s3://AWS-ID:AWS-SECRET-KEY@BUCKET-NAME/DIRECTORY-NAME
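
For reference, a sketch of the same copy with the credentials passed as -D properties instead of being embedded in the URI, using the s3n connector; the property names below are the standard s3n ones, and the bucket and directory names are the same placeholders as above:

    hadoop distcp \
      -Dfs.s3n.awsAccessKeyId=AWS-ID \
      -Dfs.s3n.awsSecretAccessKey=AWS-SECRET-KEY \
      hdfs://internalip:8020/backup \
      s3n://BUCKET-NAME/DIRECTORY-NAME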

I have tried the steps from "How can I access S3/S3n from a local Hadoop 2.6 installation?", but I am still getting the following error:

Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.s3.S3FileSystem not found
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2076)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2601)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2614)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2653)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2635)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.tools.mapred.CopyMapper.setup(CopyMapper.java:112)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.tools.DistCp.execute(DistCp.java:175)
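
The exception is thrown in CopyMapper.setup, i.e. inside the map task, which suggests the jar containing org.apache.hadoop.fs.s3.S3FileSystem (hadoop-aws on Hadoop 2.6 / HDP 2.2) is not on the task classpath. A minimal sketch for checking where that jar lives on a node; the /usr/hdp path is an assumption about the HDP layout, not something verified on this cluster:

    # Look for the hadoop-aws jar shipped with HDP (path is a guess; adjust for your install).
    find /usr/hdp -name 'hadoop-aws*.jar' 2>/dev/null
    # Check whether the client classpath already includes it.
    hadoop classpath | tr ':' '\n' | grep -i aws

Even if the jar is on the client classpath, the trace shows the failure inside the map task, so the jar also has to reach the tasks; see the -libjars sketch after the comment below.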
In Java we can pass a classpath with java -cp .... I want to run hadoop distcp and pass a classpath along with the distcp command. Is there a way to do it? – sashmi Oct 22 '15 at 21:16
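
Regarding the classpath question in the comment: DistCp implements the Tool interface, so the generic Hadoop options (-D, -libjars) apply to it. A sketch of passing extra jars both to the client JVM (via HADOOP_CLASSPATH) and to the map tasks (via -libjars); the jar paths are placeholders and would need to point at the actual hadoop-aws and AWS SDK jars on the cluster:

    # Client side: add the jars to the JVM that launches distcp.
    export HADOOP_CLASSPATH=/path/to/hadoop-aws.jar:/path/to/aws-java-sdk.jar:$HADOOP_CLASSPATH
    # Task side: -libjars ships the listed jars to the map tasks via the distributed cache.
    hadoop distcp \
      -libjars /path/to/hadoop-aws.jar,/path/to/aws-java-sdk.jar \
      hdfs://internalip:8020/backup \
      s3://AWS-ID:AWS-SECRET-KEY@BUCKET-NAME/DIRECTORY-NAME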
