
I'm trying to run Hadoop (2.2.0) on my Windows 7 machine (yes, I know that it would be better to run it on Linux, but it is not an option at this moment). I followed instructions posted at http://ebiquity.umbc.edu/Tutorials/Hadoop/14%20-%20start%20up%20the%20cluster.html and http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html

Everything went fine until I tried to start Hadoop. Every operation I try to run finishes with an "Error: Could not find or load main class ..." error.
For example, running

./hadoop version

ends up with

Error: Could not find or load main class org.apache.hadoop.util.VersionInfo

It definitely looks like a classpath problem. However, I have no idea how to solve it. I tried to set different environment variables, like $HADOOP_COMMON_HOME or $HADOOP_HOME, but with no luck.
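For reference, this is the sort of thing I tried in ~/.bashrc; the install path is just a placeholder for my layout:

```shell
# Placeholder install location -- adjust to wherever Hadoop is unpacked
export HADOOP_HOME=/cygdrive/c/hadoop-2.2.0
export HADOOP_COMMON_HOME=$HADOOP_HOME
# Make the hadoop launcher script reachable
export PATH=$PATH:$HADOOP_HOME/bin
```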

Any ideas?

Manjunath Ballur
Jakub

6 Answers

This error generally occurs because Hadoop takes your PC name as the default username, and PC names often contain blank spaces, which are not allowed.

A simple solution: go to hadoop-2.7.1 -> etc -> hadoop and open the hadoop-env.cmd file with any editor, such as Notepad++. In the last line, replace %USERNAME% with your name without blank spaces. For example:

set HADOOP_IDENT_STRING=TapasVashi

P.S. Also look through the whole file; there may be other places with %USERNAME%. Replace those with your username, without blank spaces, as well.

Tapas Vashi

This error message usually means either that you are running the program with the wrong Java version, or that the program was compiled for a newer Java version than the one you are running.

You can check your version by opening a command prompt (cmd) and typing java -version.
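For instance, assuming the legacy "1.x" version-string scheme, the major version can be pulled out like this (the quoted line is a made-up sample; real output varies by vendor and is printed to stderr):

```shell
# Hypothetical sample line from `java -version` output
ver_line='java version "1.7.0_45"'
# In the legacy "1.x" scheme, the major version is the digit after "1."
major=$(printf '%s' "$ver_line" | sed 's/.*version "\([0-9]*\)\.\([0-9]*\).*/\2/')
printf '%s\n' "$major"
```

Note that Java 9 and later dropped the "1." prefix, so this extraction only fits the older scheme.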

Manjunath Ballur
Basim Khajwal

Adding this line to ~/.bash_profile worked for me:

export HADOOP_PREFIX=/where_ever_you_install_hadoop/hadoop

FYI, I posted the same answer here: Could not find or load main class org.apache.hadoop.util.VersionInfo

chinglun

I've also been trying to get Hadoop up and running on Windows 7. For me, the problem was that Hadoop was passing CLASSPATH in Cygwin format:

CLASSPATH=/cygdrive/c/foo:/cygdrive/c/bar

However, Java expects CLASSPATH in Windows format:

CLASSPATH=c:\foo;c:\bar

Looking at hadoop-0.19.1 showed me how they handled this. You can insert the following statements into bin/hadoop, just before it calls Java at the end (and repeat for the other Java-invoking shell scripts):

# Detect whether we are running under Cygwin
cygwin=false
case "`uname`" in
  CYGWIN*) cygwin=true;;
esac

# Under Cygwin, convert POSIX-style paths to Windows form for the JVM
if $cygwin; then
  echo Cygwin
  CLASSPATH=`cygpath -p -w "$CLASSPATH"`
  HADOOP_HOME=`cygpath -d "$HADOOP_HOME"`
  HADOOP_LOG_DIR=`cygpath -d "$HADOOP_LOG_DIR"`
  TOOL_PATH=`cygpath -p -w "$TOOL_PATH"`
fi

export CLASSPATH=$CLASSPATH
echo $CLASSPATH
exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"
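As a rough illustration (not a replacement for cygpath), the conversion that cygpath -p -w performs can be mimicked in plain shell for /cygdrive/<letter>/ entries only; the paths here are hypothetical:

```shell
# Hypothetical Cygwin-style path list, as Hadoop's scripts build it
unix_cp='/cygdrive/c/foo:/cygdrive/c/bar'

# 1. swap the ':' list separator for Windows ';'
# 2. turn /cygdrive/<letter>/ into <letter>:/
# 3. flip forward slashes to backslashes
win_cp=$(printf '%s' "$unix_cp" \
  | tr ':' ';' \
  | sed 's|/cygdrive/\(.\)/|\1:/|g' \
  | tr '/' '\\')

printf '%s\n' "$win_cp"
```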
Manjunath Ballur
Gerald Chu

I had faced this problem myself. This is what solved the problem for me.

Add the following to ~/.bashrc file:

export HADOOP_CLASSPATH=$(cygpath -pw $(hadoop classpath)):$HADOOP_CLASSPATH

Note: You can install Hadoop 2.2+ directly on Windows; you don't need Cygwin.

Manjunath Ballur
Bala

My problem was that the Resource Manager (YARN) was not able to load the Hadoop libraries (jars). I solved this by updating the configuration, adding this to yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>
    C:/hadoop-2.8.0/share/hadoop/mapreduce/*,
    C:/hadoop-2.8.0/share/hadoop/mapreduce/lib/*,
    C:/hadoop-2.8.0/share/hadoop/common/*,
    C:/hadoop-2.8.0/share/hadoop/common/lib/*,
    C:/hadoop-2.8.0/share/hadoop/hdfs/*,
    C:/hadoop-2.8.0/share/hadoop/hdfs/lib/*,
    C:/hadoop-2.8.0/share/hadoop/yarn/*,
    C:/hadoop-2.8.0/share/hadoop/yarn/lib/*
  </value>
</property>

Please note that the paths used here should be adjusted to match your own system.

Amardeep Kohli