I am trying to execute a Java class by running a jar file with java -jar abc.jar, but whenever I run it I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
My hadoop-env.sh file in the /etc/hadoop/conf directory sets HADOOP_CLASSPATH as follows:
export HADOOP_CLASSPATH="/home/cloudera/commons-logging-1.1.3.jar:/home/cloudera/hadoop-common-2.4.1.jar:hadoop-core-1.2.1.jar:$HADOOP_CLASSPATH"
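As far as I understand, after sourcing that file the variable should be visible in the shell, e.g.:

source /etc/hadoop/conf/hadoop-env.sh
echo $HADOOP_CLASSPATH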
In the Java code I have certain imports like:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.FileSystem;
and I think the definitions of these classes are not being found at run time. What is causing this error and how do I resolve it?
NOTE: I am new to Hadoop, so I do not know which environment variables need to be set or how to set them. Any example would be helpful.
UPDATE: After executing java -cp $HADOOP_CLASSPATH:abc.jar:/home/extra_jars/* mypackage.classname
I now get the error java.io.IOException: Mkdirs failed to create /user/example
In my code I am calling fs.copyFromLocalFile(false, true, sourcePath, targetPath);
and targetPath is /user/example/ex1.csv.
I am unable to understand why this error occurs even though /user/example has full read-write permissions.
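For reference, the relevant part of my code looks roughly like this (the class name and the local source path are placeholders; the copy call and target path are as described above):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToHdfs {  // placeholder class name
    public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml if they are on the classpath
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path sourcePath = new Path(args[0]);                  // local file to copy (placeholder)
        Path targetPath = new Path("/user/example/ex1.csv");  // target path mentioned above

        // delSrc = false, overwrite = true, as in the call described above
        fs.copyFromLocalFile(false, true, sourcePath, targetPath);
    }
}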