
I am trying to run a simple MapReduce example in Hadoop. This is my main program:

    Configuration configuration = new Configuration();

    Job job = new Job(configuration, "conf");
    job.setMapperClass(MapClass.class);

    int numreducers = 1;

    job.setNumReduceTasks(numreducers);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);


    FileInputFormat.addInputPath(job, new Path("/user/www/input"));
    FileOutputFormat.setOutputPath(job, new Path("/user/www/output/"));
    System.exit(job.waitForCompletion(true) ? 0 : 1);

I am using hadoop-2.0.0 and guava 14.0.1. This program throws the following exception:

Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.google.common.cache.CacheBuilder$3 has interface com.google.common.base.Ticker as super class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at com.google.common.cache.CacheBuilder.<clinit>(CacheBuilder.java:190)
    at org.apache.hadoop.hdfs.DomainSocketFactory.<init>(DomainSocketFactory.java:46)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:456)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:410)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:128)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2308)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2342)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2324)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:335)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:194)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)

It seems to be a library version mismatch problem. How do I fix it?

  • Can you use guava 11.0.2 instead of 14.0.1? – zsxwing Jul 05 '13 at 01:54
  • Tried with guava 11.0.2 as well; it didn't work, same exception. – bigData Jul 05 '13 at 02:43
  • Could you check the guava version in the runtime? This [link](http://stackoverflow.com/questions/3222638/get-all-classes-in-classpath) will help you get the classpath you are using. Then you can paste the classpath here. – zsxwing Jul 05 '13 at 03:20
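Following up on zsxwing's suggestion, one way to check which Guava jar is actually resolved at runtime is to ask the class loader directly. This is a diagnostic sketch, not part of the original question; the class name `com.google.common.cache.CacheBuilder` comes from the stack trace above, and the class name `GuavaCheck` is illustrative.

```java
// Diagnostic sketch: report which jar the conflicting Guava class is
// loaded from at runtime, and print the full classpath to spot duplicates.
public class GuavaCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("com.google.common.cache.CacheBuilder");
            // The jar (or directory) this class was actually resolved from
            System.out.println("Loaded from: "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("Guava is not on the classpath at all");
        }
        // The full runtime classpath, to look for duplicate guava jars
        System.out.println("Classpath: " + System.getProperty("java.class.path"));
    }
}
```

Running this with the same classpath as the failing job shows whether an older Guava jar is shadowing 14.0.1.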

1 Answer


Checking this link might help you: click here

And if that doesn't solve it, maybe you have redundant Guava jars lying around in the classpath that are causing this exception.

Can you provide the list of files in your hadoop lib directory?
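As a starting point, a quick way to enumerate the Guava jars Hadoop can see is a `find` over the install directory. The `HADOOP_HOME` path and its fallback here are assumptions; adjust them to your installation.

```shell
# List every Guava jar under the Hadoop install directory.
# More than one copy (or a different version than your application's)
# usually means the kind of conflict shown in the stack trace above.
find "${HADOOP_HOME:-/usr/lib/hadoop}" -name 'guava-*.jar' 2>/dev/null
```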

SSaikia_JtheRocker