
By lock helpers I am referring to disposable objects with which locking can be implemented via using statements. For example, consider a typical usage of the SyncLock class from Jon Skeet's MiscUtil:

public class Example
{
    private readonly SyncLock _padlock;

    public Example()
    {
        _padlock = new SyncLock();
    }

    public void ConcurrentMethod()
    {
        using (_padlock.Lock())
        {
            // Now own the padlock - do concurrent stuff
        }
    }
}
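
For reference, I assume the helper itself is just a thin wrapper around Monitor that hands out an IDisposable token, roughly like the sketch below - not MiscUtil's actual source, and the names are made up:

using System;
using System.Threading;

// Sketch only - the assumed shape of a SyncLock-style helper
public sealed class SyncLockSketch
{
    private readonly object _monitor = new object();

    public IDisposable Lock()
    {
        Monitor.Enter(_monitor);        // acquire the underlying monitor
        return new LockToken(_monitor); // the token releases it on Dispose
    }

    private sealed class LockToken : IDisposable
    {
        private readonly object _monitor;
        public LockToken(object monitor) { _monitor = monitor; }
        public void Dispose() { Monitor.Exit(_monitor); }   // released when the using block exits
    }
}

So, assuming that is roughly how it works, the body of the using block gets Monitor's usual fences once a thread can see _padlock at all.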

Now, consider the following usage:

var example = new Example();
new Thread(example.ConcurrentMethod).Start();

My question is this - since example is created on one thread and ConcurrentMethod is called on another, couldn't ConcurrentMethod's thread be oblivious to _padlock's assignment in the constructor (due to thread caching / read-write reordering), and thus throw a NullReferenceException when reading _padlock itself?

I know that locking with Monitor/lock has the benefit of memory barriers, but when using lock helpers such as these I can't see why such barriers are guaranteed. In that case, as far as I understand, the constructor would have to be modified:

public Example()
{
    _padlock = new SyncLock();
    Thread.MemoryBarrier();
}

Source: Understanding the Impact of Low-Lock Techniques in Multithreaded Apps

EDIT: Hans Passant suggests that creating a thread implies a memory barrier. So how about:

var example = new Example();
ThreadPool.QueueUserWorkItem(s => example.ConcurrentMethod());

Now a new thread is not necessarily created, since the work item may run on an existing pool thread...

Ohad Schneider
  • At what point in time do you think it might have a cached `null` floating around? – Marc Gravell Jul 04 '11 at 18:14
  • In addition to Marc: the `_padLock` ref doesn't change so caching is irrelevant. The first read will happen _after_ it is set. Your question would have more merit if it was create-on-demand or something. – H H Jul 04 '11 at 18:21
  • Starting a thread is in itself enough to force caches to be updated. You'll have to come up with a better example. – Hans Passant Jul 04 '11 at 18:29
  • @Marc, Henk - the ctor's thread could write the SyncLock reference only to its own cache, without it showing up in main memory, so ConcurrentMethod's thread may then read a stale null – Ohad Schneider Jul 04 '11 at 18:40
  • @Hans - I was relying on Joseph Albahari's threading tutorial that states: "The following implicitly generate full fences: C#'s lock statement (Monitor.Enter/Monitor.Exit), All methods on the Interlocked class (we’ll cover these soon), Asynchronous callbacks that use the thread pool — these include asynchronous delegates, APM callbacks, and Task continuations, Setting and waiting on a signaling construct, Anything that relies on signaling, such as starting or waiting on a Task"... So starting a thread would get into the "signaling" category? – Ohad Schneider Jul 04 '11 at 18:44
  • so really, you're asking "at what point is it guaranteed to have flushed ctor initialisation into main memory" ? – Marc Gravell Jul 04 '11 at 18:45
  • It is a side effect of operating system code. That doesn't conveniently fit a C# pigeonhole. – Hans Passant Jul 04 '11 at 18:46
  • @Marc well I didn't think a ctor had different rules than any other method call in this regard – Ohad Schneider Jul 04 '11 at 18:50
  • @Hans - As if concurrent programming weren't hard enough :) How about a threadpool thread, then? see my edit – Ohad Schneider Jul 04 '11 at 18:55
  • It is no different. Waking up a tp thread still involves an internal synchronization that syncs the caches. So does any thread context switch. – Hans Passant Jul 04 '11 at 19:04
  • @Hans I see, thanks. If you'd like, I'll gladly accept your answer if you posted one – Ohad Schneider Jul 04 '11 at 19:29

1 Answer


No, you do not need to do anything special to guarantee that memory barriers are created. This is because almost any mechanism used to get a method executing on another thread produces a release-fence barrier on the calling thread and an acquire-fence barrier on the worker thread (in fact, they may be full-fence barriers). So either QueueUserWorkItem or Thread.Start will automatically insert the necessary barriers. Your code is safe.

As a matter of tangential interest, Thread.Sleep also generates a memory barrier. This is interesting because some people naively use Thread.Sleep to simulate thread interleaving. If that strategy were used to troubleshoot low-lock code, it could very well mask the problem you were trying to find.
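
For example, a test along these lines (a sketch only; the class and field names are hypothetical) tries to provoke the stale read by sleeping, but the barrier generated by the Sleep can hide exactly the reordering it is trying to expose:

using System;
using System.Threading;

// Sketch only - deliberately naive test code for the stale-read scenario
class SleepMaskingDemo
{
    private Example _shared;            // published with no synchronization at all

    public void Run()
    {
        var reader = new Thread(() =>
        {
            Thread.Sleep(50);           // naive attempt to "schedule" the read after
                                        // the write; the Sleep itself generates a
                                        // memory barrier, so the test may never
                                        // reproduce the problem it is looking for
            if (_shared != null)
                _shared.ConcurrentMethod();
        });
        reader.Start();

        _shared = new Example();        // unsynchronized publication
    }
}

The Sleep changes the memory-ordering behavior of the test, so code like this can look correct under test and still be broken in production.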

Brian Gideon
  • +1 for 'Thread.Sleep also generates a memory barrier' - extremely interesting notion, if it is indeed true – Daniel Mošmondor Jul 04 '11 at 22:10
  • Very interesting indeed. So in most cases, I wouldn't need memory barriers in the ctor, since the thread that created the object would usually be the one working on it (or dispatching other threads to work on it, which would incur a memory barrier). Is that about right? – Ohad Schneider Jul 05 '11 at 08:47