
I have a simple standard Java application that copies a file from the local file system to a remote HDFS.
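For reference, the copy is done along these lines. This is a minimal sketch, not my exact code; the class name matches the command below, but the NameNode URI and paths are placeholders, and it assumes the standard `org.apache.hadoop.fs.FileSystem` API from the hadoop-client dependency:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class copyFileApp {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Placeholder NameNode address -- substitute your cluster's fs.defaultFS
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

        // Copy a local file up to HDFS (placeholder paths)
        fs.copyFromLocalFile(new Path("/tmp/input.txt"),
                             new Path("/user/me/input.txt"));
        fs.close();
    }
}
```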

I was able to run the application successfully using the yarn command, for example:

yarn jar test.jar copyFileApp

However, the above command requires Hadoop to be installed locally, which I don't want. Is there any way I could run the app using plain Java? e.g.:

java -cp test.jar copyFileApp

When I tried with plain Java, it complained that the native Hadoop libraries were not found, e.g.:

14/09/23 16:36:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Is this possible?

diplomaticguru
  • Try looking up fatJar and build a jar with the hadoop libraries as part of the jar. There is a plugin for eclipse. – brso05 Sep 23 '14 at 14:53
  • Thank you for your response. I've already tried maven-shade-plugin to bundle all libs with the apps but no joy. – diplomaticguru Sep 23 '14 at 15:00
  • I've never tried maven but fatJar has worked really well for me... – brso05 Sep 23 '14 at 15:02
  • Okay thanks. I'll have a go at that. – diplomaticguru Sep 23 '14 at 15:03
  • Why not include other Hadoop libraries in classpath? – Venkat Sep 23 '14 at 18:13
  • @brso05 the binaries are not shipped with the jar, why should they be? Download a release and put it into your path, like the rest of the world does with libraries as well. – Thomas Jungblut Sep 23 '14 at 22:22
  • Possible duplicate of [Hadoop "Unable to load native-hadoop library for your platform" warning](http://stackoverflow.com/questions/19943766/hadoop-unable-to-load-native-hadoop-library-for-your-platform-warning) – eliasah Nov 20 '15 at 14:00
