I have an external Java client that connects to HBase. We have loaded the data primarily through batch MapReduce jobs using TableOutputFormat, and the data has been pushed to the HBase table successfully by that method. The Java client has been used to scan through the data and serve it through a REST API.
We now have a requirement to add a Put through the REST API; however, the client hangs as soon as we issue a Put. The same HBase client can access the HBaseAdmin object and create tables. We can run Get and Scan commands, but as soon as we run a Put command, it hangs.
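For context, the batch load is a standard map-only TableOutputFormat job. A rough sketch of the driver (the mapper class and table name here are placeholders, not our actual code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
import org.apache.hadoop.mapreduce.Job;

// Map-only job that emits (ImmutableBytesWritable, Put) pairs into HBase
Configuration conf = HBaseConfiguration.create();
conf.set(TableOutputFormat.OUTPUT_TABLE, "my_table");  // placeholder table name
Job job = Job.getInstance(conf, "hbase-batch-load");
job.setMapperClass(MyPutEmittingMapper.class);         // placeholder mapper
job.setNumReduceTasks(0);
job.setOutputFormatClass(TableOutputFormat.class);
job.setOutputKeyClass(ImmutableBytesWritable.class);
job.setOutputValueClass(Put.class);
job.waitForCompletion(true);

The relevant client code that hangs is below: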
Configuration config = HBaseConfiguration.create(new org.apache.hadoop.conf.Configuration());
// Point the client at the cluster's ZooKeeper ensemble
config.set("hbase.zookeeper.quorum", props.getProperty("hbase.zookeeper.quorum"));
config.set("hbase.zookeeper.property.clientPort", "2181");
config.set("zookeeper.znode.parent", props.getProperty("zookeeper.znode.parent", "/hbase"));
// Layer the cluster site files on top of the programmatic settings
InputStream confResourceAsInputStream = this.getClass().getClassLoader().getResourceAsStream("hbase-site.xml");
InputStream hdfsResourceStream = this.getClass().getClassLoader().getResourceAsStream("hdfs-site.xml");
InputStream coreSiteStream = this.getClass().getClassLoader().getResourceAsStream("core-site.xml");
config.addResource(confResourceAsInputStream);
config.addResource(hdfsResourceStream);
config.addResource(coreSiteStream);
HConnection connection = HConnectionManager.createConnection(config);
// Second argument is the ExecutorService backing the table's batch operations
HTableInterface putTable = connection.getTable(tableName.getBytes(), null);
try {
    putTable.put(put);
} finally {
    putTable.flushCommits();
    putTable.close();
}
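As an aside, HConnectionManager/HTableInterface is the older client API, deprecated as of HBase 1.0 in favor of ConnectionFactory/Table. A minimal sketch of the same write on the newer API, reusing the config, tableName, and put from above:

import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;

// HBase 1.0+ equivalent of the snippet above
Connection conn = ConnectionFactory.createConnection(config);
Table table = conn.getTable(TableName.valueOf(tableName));
try {
    table.put(put); // Table.put writes through; no flushCommits() on this API
} finally {
    table.close();
    conn.close();
}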
RESULTS: I don't know why, but HTableInterface putTable = connection.getTable(tableName.getBytes(), null); only works for Gets. I had to change it to the following, and then Puts worked:
HTableInterface putTable = connection.getTable(Bytes.toBytes(tableName));
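My working theory (an assumption on my part, not verified against the HBase source) is that the second argument to that getTable overload is the ExecutorService backing the table's batch operations, so passing null leaves Puts with no pool to run on, while Gets take a different code path. If the two-argument overload is needed, the sketch below supplies a real pool instead of null:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch only: a pool size of 4 is arbitrary
ExecutorService pool = Executors.newFixedThreadPool(4);
HTableInterface putTable = connection.getTable(Bytes.toBytes(tableName), pool);
try {
    putTable.put(put);
} finally {
    putTable.flushCommits();
    putTable.close();
    pool.shutdown(); // a caller-supplied pool is the caller's to shut down
}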