
I have a process that creates a new batch of jobs at a fixed interval (every minute) and I want to send them to kue for processing by another process.
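For context, a minimal sketch of what the producer side might look like, assuming a job type named `batch-item`, an application-level identifier `itemId` on each item, and a hypothetical `buildBatch()` helper standing in for whatever generates each minute's work:

```js
const kue = require('kue');
const queue = kue.createQueue();

function enqueueBatch(items) {
  items.forEach((item) => {
    queue
      // 'batch-item' and the itemId field are illustrative names only.
      .create('batch-item', { itemId: item.id, payload: item })
      .save((err) => {
        if (err) console.error('failed to enqueue item', item.id, err);
      });
  });
}

// A new batch every minute, matching the fixed interval described above.
// buildBatch() is a hypothetical helper, not part of kue.
setInterval(() => enqueueBatch(buildBatch()), 60 * 1000);
```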

Sometimes, the same job can be in different batches.

What happens if a job that was sent in a previous batch hasn't been completed by the time it is sent again in a new batch?

My understanding is that it will be treated as a new job and executed twice.

Is this correct, and is there a way to avoid this?

Running Turtle

1 Answer


One approach would be to trap the job complete event and traverse the list of queued jobs (as explained in this excellent post) to remove a possible duplicate, assuming that you can identify it.
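A rough sketch of that idea, assuming the job type is `batch-item` and each job's data carries an `itemId` field that identifies duplicates (both names are illustrative). It only scans the first 1000 waiting jobs and makes no attempt to be race-free:

```js
const kue = require('kue');
const queue = kue.createQueue();

// Fired for every job that finishes, queue-wide.
queue.on('job complete', (id) => {
  kue.Job.get(id, (err, completedJob) => {
    if (err) return;

    // Traverse jobs of the same type that are still waiting ('inactive')
    // and remove any whose identifier matches the job that just finished.
    kue.Job.rangeByType(completedJob.type, 'inactive', 0, 1000, 'asc', (err, jobs) => {
      if (err) return;
      jobs
        .filter((job) => job.data.itemId === completedJob.data.itemId)
        .forEach((dup) => {
          dup.remove((err) => {
            if (!err) console.log('removed duplicate of job', id);
          });
        });
    });
  });
});
```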

I have never done that myself and, if you follow this route, be wary of race conditions: I wonder if the duplicate job might be picked up by a worker before you finish traversing the pending jobs (I do not know whether this can happen).

Hope this helps.

tgo