7

alpesh@alpesh-Inspiron-3647:~/hadoop-2.7.2/sbin$ hadoop fs -ls
16/07/05 13:59:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

It also shows me the following output:

hadoop checknative -a
16/07/05 14:00:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop:  false
zlib:    false
snappy:  false
lz4:     false
bzip2:   false
openssl: false
16/07/05 14:00:42 INFO util.ExitUtil: Exiting with status 1
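The checknative output above usually means the bundled libhadoop.so does not match the platform. A quick way to confirm a mismatch is to compare the library's architecture with the OS's (a sketch; the path below assumes the stock Hadoop 2.7.2 layout and may differ on your install):

```shell
# HADOOP_HOME location is an assumption; point it at your Hadoop install.
HADOOP_HOME="${HADOOP_HOME:-$HOME/hadoop-2.7.2}"

# Architecture of the running OS (x86_64 means 64-bit).
uname -m

# Architecture the bundled native library was compiled for.
# "ELF 32-bit" here on a 64-bit OS explains the WARN message.
if ls "$HADOOP_HOME"/lib/native/libhadoop.so* >/dev/null 2>&1; then
  file "$HADOOP_HOME"/lib/native/libhadoop.so*
else
  echo "no native library found under $HADOOP_HOME/lib/native"
fi
```

If the architectures disagree, Hadoop silently falls back to the builtin-java classes, which is exactly what the warning says.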

Please help me solve this.

BruceWayne
Kalpesh Bhosale

4 Answers

12

The library you are using was compiled for 32-bit, but you are running a 64-bit system. So open the .bashrc file that holds your Hadoop configuration and go to this line:

export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"

and replace it with

export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"
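The edit only takes effect in a new shell, so reload the profile and re-run the native check (a sketch; it assumes hadoop is on your PATH and is guarded so it is harmless elsewhere):

```shell
# Reload the edited profile so the new HADOOP_OPTS is picked up.
[ -f ~/.bashrc ] && . ~/.bashrc

# Re-run the native check; "hadoop: true" in the output means the fix worked.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH"
fi
```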
Rakesh Kumar
  • Thanks Rakesh, I did this but it still shows the following error after running hadoop fs -ls: 16/07/05 14:41:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable ls: `.': No such file or directory – Kalpesh Bhosale Jul 05 '16 at 09:11
  • OK, after editing this file make sure to run source .bashrc and then check again – Rakesh Kumar Jul 05 '16 at 09:12
  • Yes, the native message is solved now. Please help me run a Hadoop command; it shows an error as follows: hadoop fs -mkdir sam → mkdir: `sam': No such file or directory – Kalpesh Bhosale Jul 05 '16 at 09:28
  • Run it like this: hadoop fs -mkdir /sam. It is just like our Linux filesystem. – Rakesh Kumar Jul 05 '16 at 09:30
  • Thanks Rakesh. Please tell me how to create a txt file in Hadoop – Kalpesh Bhosale Jul 05 '16 at 10:47
  • Dear Rakesh, when I try to copy a file from local to HDFS it shows me: File /home/kalpesh/kalpesh.txt._COPYING_ could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation. – Kalpesh Bhosale Jul 05 '16 at 10:58
  • Before starting things, did you format your name node? – Rakesh Kumar Jul 05 '16 at 11:05
  • http://stackoverflow.com/questions/26545524/there-are-0-datanodes-running-and-no-nodes-are-excluded-in-this-operation – Rakesh Kumar Jul 05 '16 at 12:18
  • HADOOP_OPTS wasn't in .bashrc; it was in /etc/hadoop/hadoop-env.sh instead, and adding the line above still gave the same error. Running Fedora 25 64-bit, Apache Spark 2.1, Hadoop 2.7 – Chaitanya Bapat Mar 12 '17 at 02:07
  • What is the best way for macOS El Capitan? Any thoughts on that? – ggorantl May 11 '17 at 23:29
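The comment thread above boils down to HDFS paths being absolute, just like a Linux filesystem. A minimal sketch of the commands discussed (the /sam directory and kalpesh.txt names come from the comments; a running cluster with at least one datanode is assumed, and the block is guarded so it is a no-op without one):

```shell
# Guarded so this does nothing where no Hadoop install is present.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir /sam                            # absolute path, not "sam"
  hadoop fs -put /home/kalpesh/kalpesh.txt /sam/   # copy a local file into HDFS
  hadoop fs -ls /sam                               # verify the upload
else
  echo "hadoop not on PATH"
fi
```

The "could only be replicated to 0 nodes" error in the comments is separate from path syntax: it means no datanode is running, which is what the linked question addresses.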
1

To get rid of this error:

Suppose the jar file is at /home/cloudera/test.jar and the class file is at /home/cloudera/workspace/MapReduce/bin/mapreduce/WordCount, where mapreduce is the package name.

The input file mytext.txt is at /user/process/mytext.txt and the output location is /user/out.

We should run this MapReduce program as follows:

$ hadoop jar /home/cloudera/test.jar mapreduce.WordCount /user/process /user/out
Jake Lee
0

Add these properties to the Hadoop user's bash profile and the issue will be solved:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native

export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"

dilshad
-2

It's just a warning: Hadoop cannot load the native library, either because it was not compiled for your platform or because it does not exist.

If I were you, I would simply silence it.

To do that, add the following to the corresponding log4j configuration file:

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
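In a stock Hadoop 2.x install the log4j config lives under the configuration directory (an assumption; adjust the path to your layout). Appending the override could look like this:

```shell
# Usual location of the log4j config in a stock Hadoop 2.x layout (assumption).
CONF="${HADOOP_HOME:-$HOME/hadoop-2.7.2}/etc/hadoop/log4j.properties"

if [ -w "$CONF" ]; then
  # Raise the NativeCodeLoader log level so the WARN line is suppressed.
  echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' >> "$CONF"
  echo "appended NativeCodeLoader override to $CONF"
else
  echo "log4j.properties not writable at $CONF; edit it manually"
fi
```

Note this only hides the message; the native library is still not loaded, so compression codecs that need it stay unavailable.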
Anxo P