21

I followed "http://codesfusion.blogspot.com/2013/10/setup-hadoop-2x-220-on-ubuntu.html" to install Hadoop on Ubuntu, but when I check the Hadoop version I get the following error:

Error: Could not find or load main class org.apache.hadoop.util.VersionInfo

Also, when I try: hdfs namenode -format

I get the following error:

Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

The java version used is:

java version "1.7.0_25"
OpenJDK Runtime Environment (IcedTea 2.3.10) (7u25-2.3.10-1ubuntu0.12.04.2)
OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)
pirho
  • 11,565
  • 12
  • 43
  • 70
usb
  • 279
  • 1
  • 5
  • 15
  • possible duplicate of [What does "Could not find or load main class" mean?](http://stackoverflow.com/questions/18093928/what-does-could-not-find-or-load-main-class-mean) – Stephen C Jan 19 '14 at 03:03
  • My path is set. I can't figure out what's wrong. – usb Jan 19 '14 at 03:26
  • Don't they have prepackaged binaries? That is usually the way to go. – yǝsʞǝla Jan 19 '14 at 04:00
  • @AlekseyIzmailov - it isn't with Java applications. Certainly not these days. – Stephen C Jan 19 '14 at 04:16
  • I don't have Ubuntu here, but I have these packages on Fedora: `$ yum search hadoop` gives: `hadoop-client.noarch, hadoop-common.noarch, hadoop-hdfs.noarch, hadoop-mapreduce.noarch` and bunch of other things. – yǝsʞǝla Jan 19 '14 at 04:20
  • There is a PPA: sudo add-apt-repository ppa:hadoop-ubuntu/stable sudo apt-get update && sudo apt-get upgrade sudo apt-get install hadoop from http://askubuntu.com/questions/144433/how-to-install-hadoop. – yǝsʞǝla Jan 19 '14 at 04:23
  • @usb: Were you able to solve the issue? Same issue for me. – Josh Jun 02 '14 at 19:53
  • Turns out it was due to a bad Cloudera installation that I had attempted earlier. I did a fresh Ubuntu install and then followed http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/ step by step and it worked. 2.2.0 did not work properly for me. I used 1.2.1 for my project and it worked great. – usb Jun 03 '14 at 12:21
  • @AlekseyIzmailov - Those are packages ... not binaries. They might include binaries, but for a Java application the chances are that they don't. – Stephen C Feb 06 '15 at 07:17

10 Answers

17

It is a problem with the environment variable setup. I couldn't find a setup that worked until now; I was trying this on Hadoop 2.6.4. Here is what to do:

# Adjust this path to wherever your Hadoop distribution is unpacked
export HADOOP_HOME=/home/centos/HADOOP/hadoop-2.6.4
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop

Add these to your .bashrc and don't forget to run

source ~/.bashrc

I think your problem will be solved, as mine was.
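
For a quick sanity check (assuming the paths above match your install), the variables should now be visible in the same shell and the VersionInfo class should resolve:

echo $HADOOP_HOME    # should print the install directory set above
hadoop version       # should print the version instead of the class-not-found error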

Somum
  • 2,382
  • 26
  • 15
7

You probably did not follow the instructions correctly. Here are some things to try to help us / you diagnose this:

  • In the shell that you ran hadoop version, run export and show us the list of relevant environment variables.

  • Show us what you put in the /usr/local/hadoop/etc/hadoop/hadoop-env.sh file.

  • If neither of the above gives you / us any clues, then find and use a text editor to (temporarily) modify the hadoop wrapper shell script. Add the line "set -xv" somewhere near the beginning. Then run hadoop version, and show us what it produces.
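
As a rough illustration of that last step (the wrapper is usually $HADOOP_HOME/bin/hadoop or /usr/local/hadoop/bin/hadoop; its exact contents vary by version), the top of the script would temporarily look something like this:

#!/usr/bin/env bash
set -xv    # temporary: echo every line and expanded command as the script runs
# ... rest of the original wrapper script, unchanged ...

Running hadoop version with this in place traces how the classpath is assembled, which usually shows where the wrong path comes from.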

Stephen C
  • 698,415
  • 94
  • 811
  • 1,216
  • I get these on running export: declare -x CLASSPATH="/usr/local/hadoop/share/hadoop/common" declare -x HADOOP_COMMON_HOME="/usr/local/hadoop" declare -x HADOOP_HDFS_HOME="/usr/local/hadoop" declare -x HADOOP_INSTALL="/usr/local/hadoop" declare -x HADOOP_MAPRED_HOME="/usr/local/hadoop" declare -x HOME="/home/hduser" declare -x JAVA_HOME="/usr/lib/jvm/jdk/" declare -x PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/hadoop/bin:/usr/local/hadoop/sbin" declare -x YARN_HOME="/usr/local/hadoop" – usb Jan 19 '14 at 04:48
  • 1
    I only added "export JAVA_HOME=/usr/lib/jvm/jdk/" to hadoop-env.sh and rest is the way it was. – usb Jan 19 '14 at 04:50
  • Do you require the entire "hadoop-env.sh" file? – usb Jan 19 '14 at 04:55
  • @usb - Nothing "leaps out" from your responses to the first two bullets. Now please try the third one. And check that "/usr/local/hadoop/share/hadoop/common" contains the hadoop JAR files. – Stephen C Jan 19 '14 at 05:12
  • 1
    For what it's worth, I was facing the same issues and it turned out to be an improperly unpacked .tar file. Hope it helps anyone looking for a solution to this issue. – Lalo Sánchez Mar 08 '16 at 14:09
3

Adding this line to ~/.bash_profile worked for me.

export HADOOP_PREFIX=/<wherever you installed hadoop>/hadoop

So just:

  1. $ sudo open ~/.bash_profile, then add the line above
  2. $ source ~/.bash_profile
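
As a quick check that the path you set is right (using whatever value you put in HADOOP_PREFIX):

$HADOOP_PREFIX/bin/hadoop version    # should print the Hadoop version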

Hope this helps (:

chinglun
  • 637
  • 5
  • 18
3

I was facing the same issue. It may seem simple, but it took two hours of my time. I tried all of the things above and they didn't help.

I just exited the shell I was in, logged into the system again, and tried again. Then things worked!
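
Re-logging works because a fresh login shell re-reads your profile files and picks up the exported variables. If you don't want to log out, a plain-bash alternative (nothing Hadoop-specific) is to start a new login shell in the same terminal:

exec bash -l     # replaces the current shell with a fresh login shell
hadoop version   # the variables exported in ~/.profile / ~/.bashrc should now be in effect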

lambzee
  • 31
  • 1
1

Try to check:

  • JAVA_HOME, all PATH related variables in Hadoop config
  • run: . ~/.bashrc (note the dot in front) to make those variables available in your environment. It seems that the guide does not mention this.
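
For example, something along these lines (the /usr/local/hadoop path is the one used in the guide and elsewhere in this thread):

echo $JAVA_HOME                                              # should point at your JDK
grep JAVA_HOME /usr/local/hadoop/etc/hadoop/hadoop-env.sh    # Hadoop's own copy of the setting
. ~/.bashrc                                                  # note the leading dot: runs the file in the current shell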
yǝsʞǝla
  • 16,272
  • 2
  • 44
  • 65
  • 1
    I'm sorry but can you elaborate on that? I have checked .bashrc for my hduser and it has all the paths mentioned in the tutorial. Am I missing something? I also ran . ~/.bashrc – usb Jan 19 '14 at 03:48
  • I meant that in the tutorial after he does `$vi .bashrc` and sets all the variables he does not really run the file so variables would not be exported to your current session. You have to run it in the same terminal in which you run other commands afterwards like `hadoop`. Alternatively relogin or reboot. I'm just guessing, maybe there is another reason for this error. – yǝsʞǝla Jan 19 '14 at 03:53
  • Thanks for your help but I tried relogin and reboot. Doesn't seem to work. Also I can't find any other post with the same error so I think it should be something extremely trivial. – usb Jan 19 '14 at 03:55
  • You can also try to run `hadoop` command from this directory: `/usr/local/hadoop-2.2.0/share/hadoop/common/`. If there is a problem with classpath it might get fixed. – yǝsʞǝla Jan 19 '14 at 03:55
  • `hadoop` must be a shell script. Can you see how hadoop is invoked in it? Something along the lines of using `hadoop-common-2.2.0.jar` – yǝsʞǝla Jan 19 '14 at 03:58
  • Try also `hadoop classpath` – yǝsʞǝla Jan 19 '14 at 04:03
  • It's definitely a classpath issue. Java can't find JARs that have required classes. This could be either because classpath is incomplete (add more directories?) or because base path used in a script somewhere is wrong. Did you move the directory for example like in one of the first steps?: `sudo mv hadoop-2.2.0 hadoop`. I suggest you redo it from scratch :) – yǝsʞǝla Jan 19 '14 at 04:16
  • I get this on running hadoop classpath: /etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-0.20-mapreduce/./:/usr/lib/hadoop-0.20-mapreduce/lib/*:/usr/lib/hadoop-0.20-mapreduce/.//* – usb Jan 19 '14 at 04:52
  • I already did everything from scratch once :( Also, I have various versions of Java. But the default JVM is set to what the tutorial requires. – usb Jan 19 '14 at 04:53
  • I did move the directory, exactly as mentioned in the tutorial. – usb Jan 19 '14 at 04:54
  • That's strange, do you have `/usr/lib/hadoop/` directory? Does it have jar files? `find /usr/lib/hadoop/ -name "*.jar"`. I think it should be `/usr/local/hadoop/lib` instead (not sure). If the first one does not have jars and second does you will need to find where it sets this path in the startup scripts and change it. This could be it. – yǝsʞǝla Jan 19 '14 at 05:36
1

I got that error, and I fixed it by editing ~/.bashrc as follows:

export HADOOP_HOME=/usr/local/hadoop
export PATH=$HADOOP_HOME/bin:$PATH

Then open a terminal and run this command:

source ~/.bashrc

Then check:

hadoop version
Elsayed
  • 2,712
  • 7
  • 28
  • 41
1

I got the same problem with Hadoop 2.7.2. After I applied the trick shown below I was able to start HDFS, but later I discovered that the tar archive I was using was missing some important pieces. So after downloading 2.7.3, everything worked as it is supposed to.

My first suggestion is to download the tar.gz again, either the same version or a newer one.

If you are still reading... this is how I solved the problem. After a fresh install, Hadoop was not able to find the jars, so I used this small trick:

  1. I located where the jars are.
  2. I made a symbolic link from that folder to $HADOOP_HOME/share/hadoop/common:

ln -s $HADOOP_HOME/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib $HADOOP_HOME/share/hadoop/common 

The version command needs hadoop-common-2.7.2.jar; this helped me find where the jars were stored.
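
A quick way to locate that jar in an unpacked distribution (plain find, nothing Hadoop-specific):

find $HADOOP_HOME -name "hadoop-common-*.jar"    # lists every copy, including the one under .../kms/.../WEB-INF/lib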

After that...

$ bin/hadoop version 
Hadoop 2.7.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e41
Compiled by jenkins on 2016-01-26T00:08Z
Compiled with protoc 2.5.0
From source with checksum d0fda26633fa762bff87ec759ebe689c
This command was run using /opt/hadoop-2.7.2/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/hadoop-common-2.7.2.jar

Of course any hadoop / hdfs command works now.

I'm a happy man again. I know this is not a clean solution, but at least it works for me.

ozw1z5rd
  • 3,034
  • 3
  • 32
  • 49
1

Here is how it works for Windows 10 Git Bash (mingw64):

export HADOOP_HOME="/PATH-TO/hadoop-3.3.0"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_CLASSPATH=$(cygpath -pw $(hadoop classpath)):$HADOOP_CLASSPATH
hadoop version

I also copied slf4j-api-1.6.1.jar into hadoop-3.3.0\share\hadoop\common.
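
In Git Bash that copy would look something like this (the source path for the slf4j jar is only an illustration; use wherever your copy lives):

cp /c/Users/you/Downloads/slf4j-api-1.6.1.jar "$HADOOP_HOME/share/hadoop/common/"    # source path is hypothetical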

caot
  • 3,066
  • 35
  • 37
0

I added the environment variables described above, but it still didn't work. Setting HADOOP_CLASSPATH as follows in my ~/.bashrc worked for me:

export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH

Eduardo Sanchez-Ros
  • 1,777
  • 2
  • 18
  • 30
-2

I used

export PATH=$HADOOP_HOME/bin:$PATH

Instead of

export PATH=$PATH:$HADOOP_HOME/bin

Then it worked for me!
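
If prepending helps, it usually means another hadoop earlier on the PATH (for example from an old packaged install) was shadowing the one you configured. A quick way to see every candidate and which one wins:

which -a hadoop    # lists every hadoop found on the PATH, in order
echo $PATH         # confirm $HADOOP_HOME/bin now comes first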

Hiren Gohel
  • 4,942
  • 6
  • 29
  • 48
Giri
  • 35
  • 5