
Suppose multiple threads periodically execute the DoWork() method below, and that at some point two threads begin executing this method almost simultaneously, so that one of the two local timestamp objects is one tick larger than the other.

```csharp
ICollection collection = // ...

public void DoWork()
{
    DateTime timestamp = DateTime.Now;

    lock (collection.SyncRoot)
    {
        // critical section
    }
}
```

If thread A is characterized by a timestamp equal to t1, while thread B is characterized by a timestamp t2 equal to t1 + 1 tick, then thread A is the first to request access to the critical section.

How does .NET manage access to the critical section by multiple threads? Does it put access requests in a queue, so that they are served in chronological order? In other words, is access to the critical section granted in the order of the threads' access requests?

leppie
enzom83
  • `In other words, is the access to the critical section guaranteed according to the order of thread access requests?` **NO** – L.B Sep 05 '12 at 17:21
  • It's mainly based on the priority assigned to the thread – Praveen Sep 05 '12 at 17:25
  • What do you mean 'thread A will require first the access to the critical section'? – MStodd Sep 05 '12 at 17:30
  • @MStodd, thread A requires access to the critical section at the time t1, and thread B requires access to the critical section at the time t2 > t1 – enzom83 Sep 05 '12 at 17:37

2 Answers


There are absolutely no guarantees on the order of thread execution or on which thread obtains the critical section first.

Note that even thread priority will not guarantee the order: different cores/CPUs can execute threads of different priorities at exactly the same time, and either thread can reach and obtain the critical section first.

Note 2: threads can also be scheduled for execution or put into a wait state at arbitrary moments, so the fact that two operations in the same thread appear next to each other in the source does not mean they will execute back to back without a delay in between. In your case this means that thread A may be suspended as soon as it obtains its timestamp, while thread B, scheduled for execution some time later, easily gets a later timestamp but reaches the critical section first.
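The race described above can be made visible with a small sketch (the names `LockOrderDemo`, `_entryOrder`, and the thread count are illustrative, not from the question): each thread takes its timestamp outside the lock and then records the order in which it actually entered the lock. On some runs the recorded list will not be sorted by timestamp.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class LockOrderDemo
{
    static readonly object _sync = new object();
    static readonly List<(int Id, long Ticks)> _entryOrder = new List<(int, long)>();

    static void DoWork(int id)
    {
        long ticks = DateTime.Now.Ticks; // timestamp taken outside the lock

        // The scheduler may pre-empt the thread right here, between
        // reading the clock and entering the lock, so lock-entry order
        // can differ from timestamp order.
        lock (_sync)
        {
            _entryOrder.Add((id, ticks));
        }
    }

    static void Main()
    {
        var threads = new List<Thread>();
        for (int i = 0; i < 4; i++)
        {
            int id = i;
            var t = new Thread(() => DoWork(id));
            threads.Add(t);
            t.Start();
        }
        threads.ForEach(t => t.Join());

        // The list reflects lock-entry order; on any given run it may
        // NOT be sorted by Ticks.
        foreach (var (id, ticks) in _entryOrder)
            Console.WriteLine($"thread {id} timestamp {ticks}");
    }
}
```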

Alexei Levenkov
  • So we can not assume or guarantee that access to the critical section occurs in timestamp order. If I want to ensure that the processing of any data associated with the timestamp occurs in timestamp order, then I should use a strategy to sort the data to be processed. – enzom83 Sep 05 '12 at 17:53
  • @enzom83, not sure what " use a strategy to sort the data to be processed" means... Does not look like comment to what I say and does not feel like a question. – Alexei Levenkov Sep 05 '12 at 18:01
  • I would like to perform some tasks in the order of arrival and not in the order of queuing in the shared queue, but I will write another question in this regard, because it would be off topic here. – enzom83 Sep 05 '12 at 21:12
  • @enzom83. Makes sense. In simplest case as long as all read/write operation on your sorted queue are protected by a lock you should be fine. Separate question is good idea to validate your approach. – Alexei Levenkov Sep 05 '12 at 21:33
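The approach discussed in the comments above, processing items in timestamp order rather than relying on lock-acquisition order, could be sketched like this. Everything here is a hypothetical illustration (the class and member names are invented); the key point, per the comments, is that all reads and writes of the ordered collection happen under the lock.

```csharp
using System;
using System.Collections.Generic;

// Sketch: instead of trusting lock order, submit timestamped items into
// a shared ordered collection under the lock, then drain it in key order.
class TimestampOrderedProcessor
{
    private readonly object _sync = new object();

    // SortedDictionary keeps entries ordered by their timestamp key.
    private readonly SortedDictionary<long, string> _pending =
        new SortedDictionary<long, string>();

    public void Submit(long timestampTicks, string payload)
    {
        lock (_sync) // protect every access to the shared collection
        {
            _pending[timestampTicks] = payload;
        }
    }

    public void DrainInOrder(Action<string> process)
    {
        lock (_sync)
        {
            foreach (var entry in _pending) // iterates in ascending key order
                process(entry.Value);
            _pending.Clear();
        }
    }
}
```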

You are making a tall assumption, one that forever gets programmers in trouble with threads. A thread that obtains the timestamp first is most certainly not guaranteed to enter the lock first as well. The odds are merely high; they are not 100%. Threads get pre-empted by the operating system scheduler, which can interrupt any thread, including one that has just started executing the call to Monitor.Enter(). Scheduling decisions may then well suspend A and allow B to obtain the lock first.

Nor does it take the scheduler to gum up the order. The core that executes A might not have the "collection" object reference in its data cache, stalling the core long enough while waiting for the memory bus to allow the core that executes B to race ahead. The word "race" is appropriate, making wrong assumptions here causes threading race bugs in your code.

The mechanism behind locks is implemented by the processor, the only entity that can ensure there is no "same time". Every multi-core CPU implements an atomic compare-and-swap instruction. You can see the version used by .NET in this answer.
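To illustrate the compare-and-swap primitive mentioned above, here is a toy spin lock built directly on `Interlocked.CompareExchange`. This is a sketch of the mechanism only, not how `Monitor` is actually implemented internally; real code should use `lock`/`Monitor` or `SpinLock`.

```csharp
using System.Threading;

// Toy spin lock on top of the CAS primitive. Note there is no queue:
// whichever thread's CAS happens to win gets the lock, with no
// ordering guarantee among the spinning losers.
class ToySpinLock
{
    private int _held; // 0 = free, 1 = held

    public void Enter()
    {
        // Atomically: if _held == 0, set it to 1; returns the old value.
        // A non-zero return means another thread already holds the lock.
        while (Interlocked.CompareExchange(ref _held, 1, 0) != 0)
        {
            Thread.SpinWait(1); // busy-wait briefly, then retry
        }
    }

    public void Exit()
    {
        Volatile.Write(ref _held, 0); // release: publish 0 to all threads
    }
}
```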

Hans Passant
  • hi @Hans Passant Does compare-and-swap atomic instruction guarantee that only one thread accesses a particular memory location AT A TIME? –  Nov 02 '22 at 13:22