
I just want to access the Hadoop file system (HDFS) through Java code, but I keep getting an exception. Here is my client class:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class hdfsClient {

    public hdfsClient() {}

    public void addFile(String source, String dest) throws IOException {
        // Load the cluster settings from the installed Hadoop's config files.
        Configuration conf = new Configuration();
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = null;
        try {
            fs = FileSystem.get(conf);
        } catch (Exception e) {
            System.out.println("Error in getting the fileSystem");
            e.printStackTrace();
        }
    }
}

The main class is something like this:

public class testMain {

    public static void main(String[] args) throws Exception {
        hdfsClient client = new hdfsClient();

        if (args.length > 0 && args[0].equals("add")) {
            if (args.length < 3) {
                System.out.println("Usage: hdfsclient add <local_path> <hdfs_path>");
                System.exit(1);
            }
            client.addFile(args[1], args[2]);
        }
    }
}
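(For completeness: once the FileSystem handle is obtained, addFile is meant to finish by copying the local file into HDFS, roughly like the sketch below. The class name here is just illustrative; copyFromLocalFile is the standard FileSystem call for uploading a local file.)

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Illustrative sketch only (hypothetical class name): upload a local file to HDFS.
public class HdfsCopySketch {

    public static void copy(String source, String dest) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/core-site.xml"));
        conf.addResource(new Path("/usr/local/hadoop/etc/hadoop/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        try {
            // Copy from the local file system into HDFS.
            fs.copyFromLocalFile(new Path(source), new Path(dest));
        } finally {
            fs.close();
        }
    }
}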

I created these files in Eclipse, exported them as a JAR, and then use

java -jar <jarname> add <path in local system> <path in hadoop>

The exact command is

java -jar add.jar add /home/aman/test.txt /

I get the following error

org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4
at org.apache.hadoop.ipc.Client.call(Client.java:1113)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:124)
at crud.crud.hdfsClient.addFile(hdfsClient.java:28)
at crud.crud.testMain.main(testMain.java:16)

I have been trying for two whole days but couldn't resolve the problem. Any help would be appreciated.

PS: Output from jps:

16341 Jps
14985 NameNode
20704 -- process information unavailable
15655 NodeManager
15146 DataNode
15349 SecondaryNameNode
15517 ResourceManager
  • See http://stackoverflow.com/questions/23634985/error-when-trying-to-write-to-hdfs-server-ipc-version-9-cannot-communicate-with and the fifth post in http://hortonworks.com/community/forums/topic/server-ipc-version-9/. It sounds like you have a library conflict. – spork Jul 19 '15 at 14:47
  • You should see this http://stackoverflow.com/questions/31453336/exception-in-thread-main-org-apache-hadoop-ipc-remoteexception-server-ipc-ver/31483536#31483536 – Abdulrahman Jul 20 '15 at 09:27

2 Answers


The problem is a version mismatch between the Hadoop libraries used in your code and the Hadoop version installed on the cluster. Remove all the libraries you added and replace them with the corresponding libraries taken from your Hadoop installation itself.
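For example (a sketch, assuming the installation lives under /usr/local/hadoop as in the question): the hadoop classpath command prints the jars of the local installation, and you can run the client against exactly those jars instead of the ones bundled into your JAR.

# Print the classpath (jars and config dirs) of the local Hadoop installation.
/usr/local/hadoop/bin/hadoop classpath

# Run the client against those jars; crud.crud.testMain is the main class
# shown in the stack trace above. (java -jar ignores -cp, so invoke the
# main class explicitly.)
java -cp "add.jar:$(/usr/local/hadoop/bin/hadoop classpath)" crud.crud.testMain add /home/aman/test.txt /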

Amal G Jose

I found the solution. I was using the hadoop-core dependency in my pom.xml, and hadoop-core belongs to the Hadoop 1.x line while the rest of my dependencies were from Hadoop 2.x, hence the version conflict: the 1.x client speaks IPC version 4, while a 2.x server expects version 9, exactly as the error message says. Removing the hadoop-core dependency solved the problem.
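Roughly, the change amounted to this in pom.xml (a sketch; the version numbers are illustrative, not necessarily the exact ones from my build):

<!-- REMOVED: hadoop-core is the Hadoop 1.x client (IPC version 4) and
     conflicts with a Hadoop 2.x cluster (IPC version 9).
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.2.1</version>
</dependency>
-->

<!-- Kept: Hadoop 2.x client libraries matching the installed cluster. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.7.1</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.7.1</version>
</dependency>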

iec2011007