
Problem

When my web application updates an item in the database, it sends a message containing the item ID via Camel to an ActiveMQ queue, whose consumer then triggers an update of an external service (Solr). The external service reads from the database independently.
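Roughly, the producing side looks like this (a Camel 2.x sketch; the endpoint and header names are simplified placeholders, not the real ones):

```java
import org.apache.camel.builder.RouteBuilder;

public class SolrUpdateProducerRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Fired by the web application after an item has been updated in the database.
        // The message only carries the item ID; the consumer re-reads the item from
        // the database itself and pushes the result to Solr.
        from("direct:itemUpdated")
            .setHeader("itemId", body())
            .to("activemq:queue:solr.update");
    }
}
```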

What I want is for the new message to be dropped if the web application sends another message with the same item ID while the old one is still on the queue, so that the Solr update doesn't run twice.

After the update request has been processed and the message with that item ID is off the queue, a new request with the same ID should be accepted again.

Is there a way to make this work out of the box? I'm really tempted to drop ActiveMQ and simply implement the update request queue as a database table with a unique constraint, ordered by timestamp or a running insert id.
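The database variant I have in mind is roughly this (table, column and class names are made up for illustration): a pending-updates table with a unique constraint on the item ID, so that enqueuing the same item twice while it is still pending simply bounces off the constraint, and deleting the row after the Solr update makes the ID acceptable again.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.SQLIntegrityConstraintViolationException;

public class UpdateRequestQueue {

    // Assumed schema:
    //   CREATE TABLE pending_update (
    //     item_id    BIGINT PRIMARY KEY,                  -- unique constraint = dedup
    //     created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP  -- processing order
    //   );

    /** Returns true if the request was accepted, false if the item is already pending. */
    public boolean enqueue(Connection con, long itemId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO pending_update (item_id) VALUES (?)")) {
            ps.setLong(1, itemId);
            ps.executeUpdate();
            return true;
        } catch (SQLIntegrityConstraintViolationException duplicate) {
            // Some JDBC drivers report this as a plain SQLException with a vendor code instead.
            return false;
        }
    }

    /** Called by the consumer after the Solr update, so the same item ID is accepted again. */
    public void markDone(Connection con, long itemId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "DELETE FROM pending_update WHERE item_id = ?")) {
            ps.setLong(1, itemId);
            ps.executeUpdate();
        }
    }
}
```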

What I tried so far

I've read this and this page on Stack Overflow. These are the solutions mentioned there:

  • Idempotent consumers in Camel: Here I can specify an expression that defines what constitutes a duplicate, but that would also prevent all future attempts to send the same message, i.e. to update the same item. I only want a new update request to be dropped while an identical one is still on the queue. (See the sketch after this list.)

  • "ActiveMQ already does duplicate checks, look at auditDepth!": Well, this looks like a good start and definitely closest to what I want, but this determines equality based on the Message ID which I cannot set. So either I find a way to make ActiveMQ generate the Message ID for this queue in a certain way or I find a way to make the audit stuff look at my item ID field instead of the Message ID. (One comment in my second link even suggests using "a well defined property you set on the header", but fails to explain how.)

  • Write a custom broker plugin that redirects incoming messages to the dead-letter queue if they match one that's already on the queue. This seems to be the most complete solution offered so far, but it feels like overkill for what I perceive as a fairly mundane, everyday task.
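For reference, the idempotent consumer variant from the first bullet would look roughly like this (a Camel 2.x sketch; the queue name, header name and cache size are placeholders). As far as I can tell, the repository remembers the item ID even after the message has been processed, which is exactly what I don't want:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.processor.idempotent.MemoryIdempotentRepository;

public class SolrUpdateConsumerRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("activemq:queue:solr.update")
            // Duplicate detection keyed on my item ID header...
            .idempotentConsumer(header("itemId"),
                    MemoryIdempotentRepository.memoryIdempotentRepository(1000))
            // ...but the ID stays in the repository after processing, so later,
            // perfectly legitimate updates of the same item get dropped as well.
            .to("bean:solrUpdater");
    }
}
```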

PS: I found another SO page that asks the same thing without an answer.

Antares42

1 Answer


What you want is not message broker functionality. Repeat after me: "A message broker is not a database, a message broker is not a database." Repeat as necessary.

The broker's job is to get messages reliably from point A to point B. The client offers some filtering capability via message selectors, but this is minimal and mainly useful for keeping a single client's consumer limited to the specific messages it is interested in, leaving the others to whatever client is in charge of processing them.
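To illustrate: a selector is just a SQL-92-style expression the consumer supplies when it subscribes, evaluated per message against headers and properties; the broker never compares messages on the queue with each other (the property and queue names below are made up):

```java
import javax.jms.Connection;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

public class SelectorExample {
    public static void main(String[] args) throws Exception {
        Connection connection =
                new ActiveMQConnectionFactory("tcp://localhost:61616").createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue queue = session.createQueue("solr.update");

        // This consumer only ever receives messages whose 'itemType' property is 'book'.
        // The selector filters individual messages; it does not look at what else is
        // sitting on the queue.
        MessageConsumer consumer = session.createConsumer(queue, "itemType = 'book'");
        // ... consumer.receive() and process as usual ...
        connection.close();
    }
}
```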

Your use case calls for a more stateful, database-centric solution, as you've described. Creating a broker plugin to walk the queue to check for a matching message is reinventing the wheel and error-prone when the queue depth is large, since ActiveMQ might not even page all the messages into memory for you, depending on memory constraints.

Tim Bish
  • Hm. Polling a database is actually exactly what we wanted to move away from, and (with `auditDepth` in mind) ActiveMQ seems to offer duplicate detection on ids, just not custom message properties. :-/ – Antares42 Jan 26 '15 at 19:23
  • The audit functionality of ActiveMQ is not meant for user customization, as that would break things like duplicate suppression during client failover handling etc. You need to build your application logic outside the Broker; the Broker is not a database. – Tim Bish Jan 26 '15 at 20:30
  • Well, in that case I don't need ActiveMQ at all. Thanks for the advice. – Antares42 Jan 26 '15 at 20:31
  • I suggest you rename the project from ActiveMQ to ActiveMB then. :-/ – Antares42 Feb 02 '15 at 14:29