
I'm well aware of how to use

hadoop fs -get ..... in Unix.

How would you get a file from Amazon AWS (HDFS) using Java?

This is a distant server, but I need to supply a .ppk file somehow.

How can it be done in Java?

This is in addition to the code from this link.

2Big2BeSmall
  • Are you looking to get a file into HDFS from Amazon AWS? – sras Nov 26 '15 at 10:54
  • I need to get files from HDFS in Java, without an FTP get. – 2Big2BeSmall Nov 26 '15 at 11:01
  • You need to use the Hadoop file system API (`org.apache.hadoop.fs.FileSystem`). Look into the `copyToLocalFile` method described here: https://hadoop.apache.org/docs/r2.6.2/api/org/apache/hadoop/fs/FileSystem.html#copyToLocalFile(boolean,%20org.apache.hadoop.fs.Path,%20org.apache.hadoop.fs.Path) – sras Nov 26 '15 at 12:18
  • I can't use Unix commands - I am working from a distant server and need to use Java only. – 2Big2BeSmall Nov 26 '15 at 12:36

1 Answer


Expanding upon what 'sras' has already noted: you have to use the org.apache.hadoop.fs.FileSystem API. That API can be invoked remotely, so you can connect to HDFS from a remote host.

The following Stack Overflow question has a code snippet and an explanation of authenticating your request in some detail:

HDFS access from remote host through Java API, user authentication

(You won't be connecting to the server over SSH using a PPK file, but over HTTP to the HDFS daemon.)
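To make this concrete, here is a minimal sketch of the `FileSystem`/`copyToLocalFile` approach from the comments. The NameNode address and the file paths below are placeholders, not values from the question - substitute your cluster's `fs.defaultFS` and your own paths:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFetch {

    /**
     * Copies a file from the filesystem at fsUri down to the local disk -
     * the programmatic equivalent of `hadoop fs -get`.
     */
    public static void fetch(String fsUri, String src, String dst) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the (possibly remote) filesystem named by the URI.
        FileSystem fs = FileSystem.get(URI.create(fsUri), conf);
        // false = do not delete the source after copying.
        fs.copyToLocalFile(false, new Path(src), new Path(dst));
        fs.close();
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical NameNode address and paths - replace with your own.
        fetch("hdfs://namenode.example.com:8020", "/data/input.txt", "/tmp/input.txt");
    }
}
```

This requires `hadoop-common` (and its dependencies) on the classpath; authentication against a secured cluster is covered in the linked question above.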

user1452132