I am trying to learn Spark + Scala. I want to read from HBase, but without MapReduce. I created a simple HBase table named "test" and did 3 puts into it. I want to read it via Spark (without HBaseTest, which uses MapReduce). I tried running the following commands in the Spark shell:
```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{Get, HTable}
import org.apache.hadoop.hbase.util.Bytes

val numbers = Array(
  new Get(Bytes.toBytes("row1")),
  new Get(Bytes.toBytes("row2")),
  new Get(Bytes.toBytes("row3")))
val conf = new HBaseConfiguration()
val table = new HTable(conf, "test")
sc.parallelize(numbers, numbers.length).map(table.get).count()
```
I keep getting this error:

```
org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: org.apache.hadoop.hbase.HBaseConfiguration
```
Can someone help me? How can I create an HTable that uses a serializable configuration?
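For context, here is the kind of workaround I have been considering but am unsure about: creating the non-serializable `HBaseConfiguration` and `HTable` inside the closure on each worker (via `mapPartitions`), so nothing non-serializable has to be shipped from the driver. This is only a sketch against the same "test" table and row keys as above; I don't know if it is the idiomatic approach:

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{Get, HTable}
import org.apache.hadoop.hbase.util.Bytes

// Ship only serializable row-key strings to the workers
val rowKeys = Array("row1", "row2", "row3")

sc.parallelize(rowKeys, rowKeys.length).mapPartitions { keys =>
  // Build the configuration and table on the worker itself,
  // so the non-serializable objects never cross the wire
  val conf = HBaseConfiguration.create()
  val table = new HTable(conf, "test")
  val results = keys.map(k => table.get(new Get(Bytes.toBytes(k)))).toList
  table.close()
  results.iterator
}.count()
```

Is something along these lines the right way to do it, or is there a way to make the configuration itself serializable?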
thanks