
I have written a Spark Streaming program that reads from a Kafka source. After some transformations, I need to send the data to two Kafka producers and also to HBase.

I receive this data via spark streaming:

Customer1   21631512435 2   1449540003.803  1449540363.571  25566530    27670   1557041 19491   65664   1   197.26.8.142    197.31.74.208
Customer2   21631526589 4   1449540003.821  1339540363.565  25536520    27369   1545811 19487   65659   5   197.25.2.135    197.31.74.206

I want to apply some transformations, send the result to two Kafka producers, and also save a copy into HBase.

I found some examples here about sending data to Kafka producers and saving to HBase, but my problem is that I don't have sbt or Maven and I'm using the Spark shell (Spark 1.3). I've run into many problems importing jars.
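For reference, the closest I've got to loading the client jars without sbt or Maven is passing them to the shell with `--jars` (the paths and versions below are placeholders; they need to match whatever is actually installed on the cluster):

```shell
spark-shell \
  --jars /path/to/kafka_2.10-0.8.2.1.jar,\
/path/to/kafka-clients-0.8.2.1.jar,\
/path/to/spark-streaming-kafka_2.10-1.3.1.jar,\
/path/to/hbase-client-1.0.0.jar,\
/path/to/hbase-common-1.0.0.jar
```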

I'm already reading from Kafka and saving the data to HDFS. Can anyone help me complete this task?
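For context, here is a minimal sketch of the kind of `foreachRDD` I'm trying to get working (the broker list, topic names, and HBase table/column names are made up; it assumes the Kafka new-producer API from kafka-clients 0.8.2 and the HBase client jars are on the classpath):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HTable, Put}
import org.apache.hadoop.hbase.util.Bytes

stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // Create the clients inside foreachPartition so they are instantiated
    // on the executors rather than serialized from the driver.
    val props = new Properties()
    props.put("bootstrap.servers", "broker1:9092")  // placeholder broker list
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    // Placeholder table name and column family.
    val table = new HTable(HBaseConfiguration.create(), "customers")

    records.foreach { line =>
      // Send the same record to both output topics...
      producer.send(new ProducerRecord[String, String]("topicA", line))
      producer.send(new ProducerRecord[String, String]("topicB", line))
      // ...and write a copy to HBase, keyed on the first field of the record.
      val fields = line.split("\\s+")
      val put = new Put(Bytes.toBytes(fields(0)))
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("raw"), Bytes.toBytes(line))
      table.put(put)
    }

    producer.close()
    table.close()
  }
}
```

This opens one producer and one HBase table handle per partition per batch, which is what I understood the usual pattern to be; I'm not sure it is the right approach in spark-shell.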

  • Possible duplicate of [How to write to Kafka from Spark Streaming](http://stackoverflow.com/questions/31590592/how-to-write-to-kafka-from-spark-streaming) – adamdunson Apr 14 '17 at 15:34
  • Can you clarify what you're looking to accomplish? It sounds like you're already able to read from kafka and write to hbase, but you're having trouble writing to kafka. Is that correct? – adamdunson Apr 14 '17 at 15:42
  • No, I'm not able to write the foreachRDD to HBase and at the same time send it to a Kafka producer (I'm using Scala). – Zied Hermi Apr 14 '17 at 17:04
