
This is my setup:

A dockerized environment, with separate Docker containers for the RoR backend and the ReactJS frontend. I want to publish events to an AWS SQS queue and perform actions on MongoDB based on the messages in SQS. SQS has been set up to track events on two different S3 buckets, say bin and bout.

Flow of events:

  1. There is an upload event on bin -> message1 published to SQS
  2. The receiver reads message1 off SQS and decides to perform action1 -> message2 published to SQS
  3. action1 completed -> message3 published to SQS
  4. The receiver reads message3 and decides to update Mongo to reflect that the action has been performed.
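
For concreteness, I imagine publishing one of these messages from the Rails side would look roughly like this (a sketch using the aws-sdk-sqs gem; the region, queue URL, and payload fields are all placeholders):

    require "aws-sdk-sqs"
    require "json"

    sqs = Aws::SQS::Client.new(region: "us-east-1") # region is an assumption

    # Hypothetical queue URL; the "type" field is what would let the
    # receiver tell message1/message2/message3 apart later.
    sqs.send_message(
      queue_url: "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
      message_body: { type: "message1", bucket: "bin", key: "path/to/upload" }.to_json
    )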

My questions:

  1. I want the backend to take care of the receiver, but I am not sure how I would start implementing this, since I don't want to run a cron job in my dockerized container as that seems like a pain.
  2. Also, I am planning to delete the messages once they have been processed/read (a rough sketch of the receiver loop I have in mind is below).
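
Something along these lines is what I'm picturing for the receiver, instead of a cron job (a rough sketch only; the queue URL is a placeholder):

    require "aws-sdk-sqs"
    require "json"

    sqs = Aws::SQS::Client.new(region: "us-east-1")
    queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue" # placeholder

    loop do
      # Long polling: wait up to 20s for messages instead of polling on a schedule.
      resp = sqs.receive_message(
        queue_url: queue_url,
        max_number_of_messages: 10,
        wait_time_seconds: 20
      )
      resp.messages.each do |msg|
        payload = JSON.parse(msg.body)
        # ... act on payload["type"], update Mongo, etc. ...

        # Delete only after successful processing so failed messages reappear.
        sqs.delete_message(queue_url: queue_url, receipt_handle: msg.receipt_handle)
      end
    end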

Can someone give me some insight into this? LMK if I need to draw up a diagram to explain the flow. Thanks!

premunk

1 Answer


I'm not 100% sure I understand correctly, but what you are trying to achieve is a simple workflow of tasks where some tasks can take a long time. All you have to do is create one or more Elastic Beanstalk Worker environments. Such an environment has a daemon automatically installed. This daemon takes messages from the queue and sends them to one REST POST action of your application. If you respond to that call with code 200, it will remove the message from the queue. If not, it will retry a few times and then send the message to a dead-letter queue (if one was configured).
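
A minimal sketch of such a REST action in Rails (the route, controller, and service object names are assumptions; the point is that a 200 response is what tells the daemon to delete the message):

    # config/routes.rb (assumed): post "/worker/messages", to: "worker_messages#create"
    class WorkerMessagesController < ApplicationController
      # The daemon POSTs the raw message body, not a browser form,
      # so CSRF protection has to be skipped for this endpoint.
      skip_before_action :verify_authenticity_token

      def create
        payload = JSON.parse(request.raw_post)
        ProcessMessage.call(payload) # hypothetical service object doing the real work
        head :ok                     # 200 -> daemon deletes the message from the queue
      rescue StandardError
        head :internal_server_error  # non-200 -> daemon retries, then dead-letter queue
      end
    end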

So you have to develop one or more REST actions in your application that process your messages (or create many applications). If you decide to create one application with one REST action, then you have to send some information in the message about what type it is (message 1, 2, or 3), because a worker can have only one REST action as its entry point.
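
Branching on such a type field inside that single action could look like this (field and handler names are made up for illustration):

    # Inside the single REST action, after parsing the body:
    case payload["type"]
    when "message1" then StartAction1.call(payload) # hypothetical handlers
    when "message2" then RunAction1.call(payload)
    when "message3" then UpdateMongo.call(payload)
    else
      Rails.logger.warn("Unknown message type: #{payload['type']}")
    end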

The number of queues is up to you: you could have one queue with all message types and one worker, or you could have two or three queues with a worker attached to each. It only depends on the nature of the tasks. If they take similar time to perform and you don't need to process some message types faster than others, then I would probably go with one queue, one worker, and messages that say "I am a message of type 1".

Workers can be created in different technologies (Java, Node.js, Python, Ruby, some Docker container); the only requirement is that the application has a REST POST action to which the messages will be sent.

Panuf