Spark Streaming can be fed from many sources, as explained in the documentation (Kafka, Flume, Twitter, ZeroMQ, Kinesis, or plain old TCP sockets). Does anybody know how to feed Spark Streaming from Amazon SQS?
- Asking in the user mailing list of Spark might help. – Tathagata Das Nov 27 '14 at 00:07
- Thanks, I suppose nobody else is responding here ... – antoneti Jan 23 '15 at 12:08
- @antoneti what lib did you end up using? any recommendations? – skboro Aug 20 '20 at 06:47
1 Answer
There's a GitHub project called spark-sqs-receiver. It has been published to the Maven repository with the groupId com.github.imapi and the artifactId spark-sqs-receiver_2.10, currently at version 1.0.1. By the looks of the GitHub project, it's being actively maintained as well. The following is some sample code shamelessly copied from the project's README.md file:
ssc.receiverStream(new SQSReceiver("sample")   // "sample" is the SQS queue name
  .credentials(<key>, <secret>)                // AWS access key and secret key
  .at(Regions.US_EAST_1)                       // AWS region the queue lives in
  .withTimeout(2))                             // timeout setting (see the project's README for units)
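
For context, here is a minimal sketch of how that receiver might be wired into a complete streaming job. The SQSReceiver calls are copied from the snippet above and the dependency coordinates are the ones mentioned earlier; everything else (app name, batch interval, the SQSReceiver import path, and the assumption that the receiver emits message bodies as strings) is a guess on my part rather than something confirmed by the project:

// build.sbt (coordinates as given above; the suffix suggests a Scala 2.10 build):
//   libraryDependencies += "com.github.imapi" % "spark-sqs-receiver_2.10" % "1.0.1"

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.amazonaws.regions.Regions
// plus the SQSReceiver import from the spark-sqs-receiver library

val conf = new SparkConf().setAppName("sqs-streaming-example")
val ssc  = new StreamingContext(conf, Seconds(10))   // 10-second micro-batches

// "sample" is the SQS queue name; fill in real AWS credentials
val messages = ssc.receiverStream(new SQSReceiver("sample")
  .credentials(<key>, <secret>)
  .at(Regions.US_EAST_1)
  .withTimeout(2))

messages.print()   // assumes the receiver emits message bodies as strings

ssc.start()
ssc.awaitTermination()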

user2370813
- How can we achieve a similar operation with PySpark? Is it even possible? Thank you. – OSK Jan 11 '18 at 12:12