
I'm new to Accumulo and trying to install v1.7 on a Cloudera VM.

I have Java 1.7 and HDP 2.2, and ZooKeeper is currently running. I've mainly been following INSTALL.md and have configured Accumulo, however I get the following error when trying to initialise:

./bin/accumulo init
2016-02-23 09:24:07,999 [start.Main] ERROR: Problem initializing the class loader
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.accumulo.start.Main.getClassLoader(Main.java:68)
        at org.apache.accumulo.start.Main.main(Main.java:52)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
        at org.apache.commons.vfs2.impl.DefaultFileSystemManager.<init>(DefaultFileSystemManager.java:120)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.generateVfs(AccumuloVFSClassLoader.java:246)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.getClassLoader(AccumuloVFSClassLoader.java:204)
        ... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 9 more
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: org/apache/commons/io/FileUtils
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.close(AccumuloVFSClassLoader.java:406)
        at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader$AccumuloVFSClassLoaderShutdownThread.run(AccumuloVFSClassLoader.java:74)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.io.FileUtils
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        ... 3 more

I've read other posts where this has been put down to a bad setting in accumulo-env.sh, however as shown below I don't see what I'm missing:

if [[ -z $HADOOP_HOME ]] ; then
   test -z "$HADOOP_PREFIX"      && export HADOOP_PREFIX=/usr/lib/hadoop
else
   HADOOP_PREFIX="$HADOOP_HOME"
   unset HADOOP_HOME
fi

# hadoop-2.0:
test -z "$HADOOP_CONF_DIR"       && export HADOOP_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
test -z "$ACCUMULO_HOME"         && export ACCUMULO_HOME="/etc/accumulo/accumulo-1.7.0"
test -z "$JAVA_HOME"             && export JAVA_HOME="/usr/java/jdk1.7.0_67-cloudera"
test -z "$ZOOKEEPER_HOME"        && export ZOOKEEPER_HOME=/usr/lib/zookeeper
test -z "$ACCUMULO_LOG_DIR"      && export ACCUMULO_LOG_DIR=$ACCUMULO_HOME/logs
if [[ -f ${ACCUMULO_CONF_DIR}/accumulo.policy ]]
then
   POLICY="-Djava.security.manager -Djava.security.policy=${ACCUMULO_CONF_DIR}/accumulo.policy"
fi
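One quick sanity check on these settings (a minimal sketch; the default paths below are copied from the config above and may differ on your VM) is to confirm the directories actually exist:

```shell
# Print which of the directories referenced in accumulo-env.sh actually exist.
check_dir() {
  if [ -d "$1" ]; then echo "OK: $1"; else echo "MISSING: $1"; fi
}
check_dir "${HADOOP_PREFIX:-/usr/lib/hadoop}"
check_dir "${ZOOKEEPER_HOME:-/usr/lib/zookeeper}"
check_dir "${JAVA_HOME:-/usr/java/jdk1.7.0_67-cloudera}"
check_dir "${ACCUMULO_HOME:-/etc/accumulo/accumulo-1.7.0}"
```

A MISSING line would suggest the corresponding export needs adjusting before the classloader can find any jars.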

Furthermore, I have the following in my general.classpaths:

<property>
<name>general.classpaths</name>

<value>
  <!-- Accumulo requirements -->
  $ACCUMULO_HOME/lib/accumulo-server.jar,
  $ACCUMULO_HOME/lib/accumulo-core.jar,
  $ACCUMULO_HOME/lib/accumulo-start.jar,
  $ACCUMULO_HOME/lib/accumulo-fate.jar,
  $ACCUMULO_HOME/lib/accumulo-proxy.jar,
  $ACCUMULO_HOME/lib/[^.].*.jar,
  <!-- ZooKeeper requirements -->
  $ZOOKEEPER_HOME/zookeeper[^.].*.jar,
  <!-- Common Hadoop requirements -->
  $HADOOP_CONF_DIR,
  <!-- Hadoop 2 requirements --><!--
  $HADOOP_PREFIX/share/hadoop/common/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/common/lib/(?!slf4j)[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/hdfs/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/mapreduce/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/yarn/[^.].*.jar,
  $HADOOP_PREFIX/share/hadoop/yarn/lib/jersey.*.jar,
  --><!-- End Hadoop 2 requirements -->
  <!-- HDP 2.0 requirements --><!--
  /usr/lib/hadoop/[^.].*.jar,
  /usr/lib/hadoop/lib/[^.].*.jar,
  /usr/lib/hadoop-hdfs/[^.].*.jar,
  /usr/lib/hadoop-mapreduce/[^.].*.jar,
  /usr/lib/hadoop-yarn/[^.].*.jar,
  /usr/lib/hadoop-yarn/lib/jersey.*.jar,
  --><!-- End HDP 2.0 requirements -->
  <!-- HDP 2.2 requirements -->
  /usr/hdp/current/hadoop-client/[^.].*.jar,
  /usr/hdp/current/hadoop-client/lib/(?!slf4j)[^.].*.jar,
  /usr/hdp/current/hadoop-hdfs-client/[^.].*.jar,
  /usr/hdp/current/hadoop-mapreduce-client/[^.].*.jar,
  /usr/hdp/current/hadoop-yarn-client/[^.].*.jar,
  /usr/hdp/current/hadoop-yarn-client/lib/jersey.*.jar,
  /usr/hdp/current/hive-client/lib/hive-accumulo-handler.jar
  /usr/lib/hadoop/lib/commons-io-2.4.jar
  <!-- End HDP 2.2 requirements -->
</value>
<description>Classpaths that accumulo checks for updates and class files.</description>
</property>
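For what it's worth, my understanding is that each entry above is a directory plus a Java regex over file names, so patterns like lib/[^.].*.jar deliberately skip dotfiles. A rough approximation of that match with grep -E (the temp directory is just a stand-in):

```shell
# Stand-in directory illustrating which file names a [^.].*.jar entry picks up.
tmp=$(mktemp -d)
touch "$tmp/commons-io-2.4.jar" "$tmp/.hidden.jar" "$tmp/notes.txt"
# Same shape as the lib/[^.].*.jar entries above: non-dot start, .jar end.
ls -A "$tmp" | grep -E '^[^.].*\.jar$'   # prints only commons-io-2.4.jar
rm -rf "$tmp"
```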

Any help would be appreciated. Interestingly, I get the same error when trying to run ./bin/accumulo classpath

jhole89
  • Did you ever solve this issue? I'm having the exact same problem, can't run ./bin/accumulo classpath either. – xv70 Sep 29 '18 at 01:36
  • unfortunately not and I moved on to something new so I'll never know if it got solved – jhole89 Oct 02 '18 at 10:13

2 Answers


Accumulo expects to pull the commons-io-2.4.jar from your Hadoop installation. I'm not sure whether CDH is packaging this jar at all, or if your configuration files are just not pointing at it correctly.

You can try examining the output of accumulo classpath to see what the items on the classpath actually expand to. The general.classpaths configuration item in accumulo-site.xml is what you will want to inspect/modify.
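For example, something along these lines can confirm whether the jars the stack trace complains about exist anywhere Hadoop-related (the directories are guesses for a CDH/HDP VM; adjust to your layout), so you can compare the hits against the regexes in general.classpaths:

```shell
# Look for the commons jars named in the NoClassDefFoundError messages.
for d in /usr/lib/hadoop /usr/hdp/current/hadoop-client; do
  if [ -d "$d" ]; then
    find -L "$d" \( -name 'commons-io-*.jar' -o -name 'commons-logging-*.jar' \)
  fi
done
```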

elserj
  • thanks elserj, I've edited my general.classpath to include this (see above) but still get the same issue. As per my original post (last line), I'm unable to examine the output of `accumulo classpath` as get the same error – jhole89 Feb 24 '16 at 09:17
  • Oops, sorry. I missed that last bit about being unable to run the classpath cmd :). You could try something like `find -L /usr/lib/hadoop -name 'commons-io-2.4.jar'` and compare that with the regex's in general.classpaths. – elserj Feb 24 '16 at 21:31

unset CLASSPATH

I had the same problem and it took me hours to figure out: a CLASSPATH already exported in my shell was interfering, and unsetting it before running Accumulo made the error go away.
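If it helps anyone, the likely mechanism (my guess, not verified against the Accumulo launcher scripts) is that a CLASSPATH exported by the login shell, e.g. from /etc/profile.d, leaks into bin/accumulo and shadows the jars it would otherwise discover:

```shell
# A stale exported CLASSPATH (the path here is made up for illustration).
export CLASSPATH=/some/stale/dir/old-commons.jar
# Clear it in the same shell before running bin/accumulo init.
unset CLASSPATH
echo "CLASSPATH is now: ${CLASSPATH:-<unset>}"   # prints: CLASSPATH is now: <unset>
```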

noe