
I have searched a lot about the following issue that I am facing:

java.io.IOException: Call to /10.0.1.37:50070 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    ....

I found links like "What is the meaning of EOF exceptions in hadoop namenode connections from hbase/filesystem?" and others, but none of them worked for me.

Now I am starting to feel that I do not properly understand the version compatibility issues. What confuses me the most is the HBase documentation about Hadoop compatibility, which says something like "This version of HBase will run only on Hadoop 0.20". What does 'this' refer to here? Do they mean 0.93-snapshot (the version shown at the top of the documentation)?

Finally, I am using Hadoop version 0.20.203 and HBase 0.90.4. Can someone tell me whether these versions are compatible?

Thanks in advance!!

Gowtham

2 Answers

1

I agree that the book's reference to "this version" and to "0.93" is confusing. To make things a bit more clear: the book currently transcends versions but lives only in trunk, which is currently called 0.93 (and when you compile it, -snapshot is appended).

In any case, all current HBase versions are compatible with all Hadoop 0.20.* releases, be it 0.20.2 or 0.20.205.0, and the latter is the only one right now that supports appends. The version you are using, 0.20.203, does not, and you can lose data if a region server dies.

Your EOFException is probably because you didn't properly swap the Hadoop jars in your HBase lib/ folder. I answered a similar question on the mailing list yesterday, "EOFException in HBase 0.94" (it was mistitled 0.94; it should have been 0.90.4), which gives other clues on debugging this.
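For what it's worth, the jar swap usually looks something like the following. This is a sketch, not an exact recipe: it assumes $HADOOP_HOME and $HBASE_HOME point at your installs, and the exact jar file names will differ between releases, so check what is actually in those directories first.

```shell
# Remove the Hadoop jar that ships bundled with HBase
# (it very likely does not match the Hadoop version running on your cluster)
rm $HBASE_HOME/lib/hadoop-core-*.jar

# Copy in the core jar from the Hadoop build your cluster actually runs
cp $HADOOP_HOME/hadoop-core-0.20.203.0.jar $HBASE_HOME/lib/

# Restart HBase so the replacement jar is picked up
$HBASE_HOME/bin/stop-hbase.sh
$HBASE_HOME/bin/start-hbase.sh
```

The RPC wire format changed between Hadoop 0.20.x releases, which is why a mismatched client jar in HBase's classpath shows up as an EOFException when it talks to the Namenode.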

Finally, your stack trace has a weird port number in it. 50070 is the web UI, not the Namenode RPC port which by default is 9000. It could be that you are giving HBase the wrong port number.
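To illustrate the port issue, hbase.rootdir in hbase-site.xml has to point at the Namenode's RPC port, not the web UI port. A sketch, assuming your fs.default.name uses the common default port 9000 and an /hbase root directory:

```xml
<!-- hbase-site.xml: point at the Namenode RPC port (9000 here), not the web UI (50070) -->
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://10.0.1.37:9000/hbase</value>
  </property>
</configuration>
```

Whatever host:port appears in fs.default.name in your Hadoop core-site.xml is the one HBase should use here.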

-1

I took inputs from the links posted and it worked for me. The only extra step was copying an additional guava*.jar found in $HADOOP_HOME/lib into $HBASE_HOME/lib (I am using hadoop-0.20.2).
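For anyone following along, that copy step is just (assuming $HADOOP_HOME and $HBASE_HOME are set; the guava jar's version number varies by release):

```shell
# Copy Hadoop's bundled guava jar into HBase's classpath,
# then restart HBase so it is picked up
cp $HADOOP_HOME/lib/guava-*.jar $HBASE_HOME/lib/
$HBASE_HOME/bin/stop-hbase.sh && $HBASE_HOME/bin/start-hbase.sh
```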