I have written a Spark Streaming program that reads from a Kafka source. After some transformations, I need to send the data to two Kafka producers and also to HBase.
I receive this data via Spark Streaming:
Customer1 21631512435 2 1449540003.803 1449540363.571 25566530 27670 1557041 19491 65664 1 197.26.8.142 197.31.74.208
Customer2 21631526589 4 1449540003.821 1339540363.565 25536520 27369 1545811 19487 65659 5 197.25.2.135 197.31.74.206
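For context, each record is a single whitespace-separated line. A minimal parsing sketch (the field names are my guesses for illustration only, not the real schema):

```scala
// Hypothetical record structure; only the first two fields and the two
// trailing IPs are assumed, everything else is kept as raw strings.
case class Record(customer: String, id: String, metrics: Array[String],
                  srcIp: String, dstIp: String)

def parse(line: String): Record = {
  val f = line.trim.split("\\s+")
  Record(f(0), f(1), f.slice(2, f.length - 2), f(f.length - 2), f.last)
}
```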
I want to apply some transformations, send the result to two Kafka producers, and also save a copy to HBase.
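This is roughly the pattern I am trying to reach, a sketch only, assuming the Kafka 0.8 producer API and the pre-1.0 HBase client that shipped in the Spark 1.3 era; the broker address, topic names, table name, and column family are placeholders:

```scala
import java.util.Properties
import kafka.producer.{KeyedMessage, Producer, ProducerConfig}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

// stream: DStream[String] already created from Kafka
stream.foreachRDD { rdd =>
  rdd.foreachPartition { partition =>
    // Create one producer and one table handle per partition,
    // never per record, to avoid serialization and connection churn.
    val props = new Properties()
    props.put("metadata.broker.list", "broker1:9092")  // placeholder broker
    props.put("serializer.class", "kafka.serializer.StringEncoder")
    val producer = new Producer[String, String](new ProducerConfig(props))

    val table = new HTable(HBaseConfiguration.create(), "customer_events")

    partition.foreach { line =>
      // Send the same record to two topics (placeholder names)
      producer.send(new KeyedMessage[String, String]("topicA", line))
      producer.send(new KeyedMessage[String, String]("topicB", line))

      // Save a copy to HBase, keyed here (arbitrarily) on the second field
      val fields = line.split("\\s+")
      val put = new Put(Bytes.toBytes(fields(1)))
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("raw"), Bytes.toBytes(line))
      table.put(put)
    }
    table.close()
    producer.close()
  }
}
```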
I found some examples here about sending data to Kafka producers and saving to HBase, but my problem is that I don't have sbt or Maven and I'm using the Spark shell (Spark 1.3). I've run into many problems importing jars.
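Without sbt or Maven, the only way I know to get the dependencies onto the classpath is to pass them to spark-shell directly; this is what I have been attempting (paths and versions are examples, not my actual layout):

```shell
# Pass the Kafka and HBase client jars (plus their dependencies) to the
# shell; all jars go in one comma-separated list after --jars.
spark-shell --jars /path/to/spark-streaming-kafka_2.10-1.3.0.jar,\
/path/to/kafka_2.10-0.8.2.1.jar,\
/path/to/hbase-client.jar,\
/path/to/hbase-common.jar
```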
I'm already reading from Kafka and saving the data to HDFS. Can anyone help me complete this task?