
I have a table of Orders that will be processed by different services. Assume one service starts processing a batch of orders and, before it is finished, another service comes along and starts processing the same batch, because the first batch is not done yet and the Orders table has not been updated to reflect that.

How do we make sure this doesn't happen? If one service is processing a batch, those orders should be "locked" so that another service skips them.

What's the best way to do this?

Thanks.

EDIT: Basically I am looking for something like this:

    private volatile bool IS_Process_RUNNING = false;

    public void ProcessOrders() {
        lock (this) {
            // Another caller is already processing; bail out immediately.
            if (IS_Process_RUNNING) {
                logger.error("ProcessOrders() is already running.");
                return;
            }
            IS_Process_RUNNING = true;
        }

        try {
            DoSomeWork();
        }
        catch (Exception e) {
            LogFatalError.logFatalError("Error processing orders.....", e);
        }
        finally {
            // Always clear the flag so the next run can proceed.
            IS_Process_RUNNING = false;
        }
    }

So when a process (the first one) calls this method, it finds IS_Process_RUNNING set to false, sets it to true, and does what it's supposed to do. A different process that arrives while the flag is true will exit immediately, and the finally block resets the flag to false once the work is done.
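
For comparison, here is a minimal sketch of the same "skip if already running" guard using Interlocked.CompareExchange for the atomic test-and-set instead of a lock around the flag; DoSomeWork and the console logging are placeholders for the real processing and logger:

    using System;
    using System.Threading;

    public class OrderProcessor {
        // 0 = idle, 1 = running; Interlocked works on ints, not bools.
        private static int isRunning = 0;

        public void ProcessOrders() {
            // Atomically flip the flag from 0 to 1. If it was already 1,
            // another caller is processing and this call exits immediately.
            if (Interlocked.CompareExchange(ref isRunning, 1, 0) == 1) {
                Console.Error.WriteLine("ProcessOrders() is already running.");
                return;
            }
            try {
                DoSomeWork(); // placeholder for the real batch processing
            }
            catch (Exception e) {
                Console.Error.WriteLine("Error processing orders: " + e);
            }
            finally {
                // Allow the next caller to run.
                Interlocked.Exchange(ref isRunning, 0);
            }
        }

        private void DoSomeWork() { /* ... */ }
    }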

JohnnyCage
  • Check this http://stackoverflow.com/questions/939831/sql-server-process-queue-race-condition/940001#940001 – Yousuf Aug 14 '15 at 20:49
  • Something like this? [Threadsafe FIFO Queue/Buffer](http://stackoverflow.com/questions/12375339/threadsafe-fifo-queue-buffer)? – dbc Aug 14 '15 at 20:49
  • You can follow the scheduler and active object patterns: have a process that schedules order processing and feeds order-processor objects with orders; when an order processor finishes, it updates the order to indicate it has been processed (see the sketch after these comments). – BhavO Aug 14 '15 at 20:56
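
A rough sketch of that scheduler/consumer idea in C#, assuming a single scheduler owns the queue; the Order type and the Enqueue/MarkAsProcessed names are placeholders, not anything from the original code:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    // Placeholder standing in for a row from the Orders table.
    public class Order {
        public int Id { get; set; }
    }

    public class OrderScheduler {
        // The scheduler adds orders; worker tasks drain the queue, so each
        // order is handed to exactly one processor.
        private readonly BlockingCollection<Order> _queue = new BlockingCollection<Order>();

        public void Enqueue(Order order) { _queue.Add(order); }

        public void CompleteAdding() { _queue.CompleteAdding(); }

        public Task StartWorker() {
            return Task.Run(() => {
                foreach (var order in _queue.GetConsumingEnumerable()) {
                    // Do the actual work, then mark the order as processed in
                    // the Orders table so it is never picked up again.
                    MarkAsProcessed(order);
                }
            });
        }

        private void MarkAsProcessed(Order order) {
            Console.WriteLine("Order " + order.Id + " processed.");
        }
    }

Starting several workers gives you multiple consumers, and GetConsumingEnumerable guarantees each queued order is dequeued by only one of them.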

1 Answer

    private static readonly Object obj = new Object();
    lock (obj) { ..... }

This works! Thanks.
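
For reference, the lock statement makes a second caller wait for the first one to finish. If the goal from the EDIT is to skip the run instead, Monitor.TryEnter on the same lock object is one way to do that without the manual flag; a minimal sketch, with DoSomeWork and the console logging standing in for the real processing and logger:

    using System;
    using System.Threading;

    public class OrderProcessorService {
        // Same dedicated lock object as in the snippet above.
        private static readonly object obj = new object();

        public void ProcessOrders() {
            // TryEnter returns false instead of blocking when another call
            // already holds the lock, so this run is simply skipped.
            if (!Monitor.TryEnter(obj)) {
                Console.Error.WriteLine("ProcessOrders() is already running.");
                return;
            }
            try {
                DoSomeWork(); // placeholder for the real batch processing
            }
            catch (Exception e) {
                Console.Error.WriteLine("Error processing orders: " + e);
            }
            finally {
                Monitor.Exit(obj);
            }
        }

        private void DoSomeWork() { /* ... */ }
    }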

JohnnyCage