This is a subtle problem that requires careful analysis.
First off, the code posed in the question is pointless, because it does a null check on a local variable that is guaranteed to not be null. Presumably the real code reads from a non-local variable that may or may not be null, and may be altered on multiple threads.
This is a super dangerous position to be in and I strongly discourage you from pursuing this architectural decision. Find another way to share memory across workers.
To address your question:
The first question is: does the `?.` operator have the same semantics as your version where you introduce a temporary?
Yes, it does. But we're not done.
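To make the equivalence concrete, here is a small sketch (the variable name is borrowed from the question; the surrounding program is my invention):

```csharp
using System;
using System.Collections.Generic;

List<object>? someListThatCouldBeNull = new List<object>();

// Null-conditional form: the variable is read exactly once into a hidden
// temporary; both the null check and the call use that temporary.
someListThatCouldBeNull?.Add(new object());

// Equivalent explicit form: one read, then check, then call on the copy.
var tempList = someListThatCouldBeNull;
if (tempList != null)
    tempList.Add(new object());

Console.WriteLine(someListThatCouldBeNull.Count); // 2: each form added one item
```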
The second question, that you did not ask, is: is it possible that the C# compiler, jitter, or CPU causes the version with the temporary to introduce an extra read? That is, are we guaranteed that
```csharp
var tempList = someListThatCouldBeNull;
if (tempList != null)
    tempList.Add(new object());
```
is never executed as though you wrote
```csharp
var tempList = someListThatCouldBeNull;
if (tempList != null)
    someListThatCouldBeNull.Add(new object());
```
The question of "introduced reads" is complicated in C#, but the short version is: generally speaking you can assume that reads will not be introduced in this manner.
Are we good? Of course not. The code is still not threadsafe, because `Add` might be called on multiple threads at once, which is undefined behaviour!
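One possible repair, shown only as a sketch: serialize every access to the shared variable and the list under a single lock. The `SharedList` wrapper and its members are my invention, not code from the question:

```csharp
using System;
using System.Collections.Generic;

class SharedList
{
    private readonly object gate = new object();
    private List<object>? list;

    public void Initialize()
    {
        lock (gate) { list = new List<object>(); }
    }

    public void AddItem()
    {
        // The read of the field, the null check, and the mutation all happen
        // under the same lock, so no other thread can interleave with them.
        lock (gate) { list?.Add(new object()); }
    }

    public int Count
    {
        get { lock (gate) { return list?.Count ?? 0; } }
    }
}
```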
Suppose we fix that, somehow. Are things good now?
No. We still should not have confidence in this code.
Why not?
The original poster has shown no mechanism which guarantees that an up-to-date value of `someListThatCouldBeNull` is being read. Is it accessed under a lock? Is it volatile? Are memory barriers introduced? The C# specification is very clear on the fact that reads may be moved arbitrarily backwards in time if there are no special effects such as locks or volatiles involved. You might be reading a cached value.
Similarly, we have not seen the code which does the writes; those writes can be moved arbitrarily far into the future. Any combination of a read moved into the past or a write moved into the future can lead to a "stale" value being read.
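One way to constrain those movements is to mark the field `volatile`; a sketch, with names of my own invention:

```csharp
using System.Collections.Generic;

static class Publisher
{
    // 'volatile' forbids the compiler, jitter, and CPU from moving reads of
    // this field arbitrarily into the past or caching a stale value.
    private static volatile List<object>? someListThatCouldBeNull;

    // A volatile write has release semantics: it cannot be moved
    // arbitrarily into the future past a subsequent volatile read.
    public static void Publish() => someListThatCouldBeNull = new List<object>();

    // A volatile read has acquire semantics: it observes a value at least as
    // fresh as the volatile write it synchronizes with.
    public static bool TryRead()
    {
        var tempList = someListThatCouldBeNull;
        return tempList != null;
    }
}
```

Note that `volatile` only addresses the visibility and ordering of the reference itself; it does nothing to make concurrent `Add` calls safe, so the locking concern above still applies.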
Now suppose we solve that problem. Does that solve the whole problem? Certainly not. We do not know how many threads there are involved, or if any of those threads are also reading related variables, and if there are any assumed ordering constraints on those reads. C# does not require that there be a globally consistent view of the order of all reads and writes! Two threads may disagree on the order in which reads and writes to volatile variables happened. That is, if the memory model permits two possible observed orderings, it is legal for one thread to observe one, and the other thread to observe the other. If your program logic implicitly depends on there being a single observed ordering of reads and writes, your program is wrong.
Now perhaps you see why I strongly advise against sharing memory in this manner. It is a minefield of subtle bugs.
So what should you do?
- If you can: stop using threads. Find a different way to handle your asynchrony.
- If you cannot do that, use threads as workers that solve a problem and then go back to the pool. Having two threads both hammering on the same memory at the same time is hard to get right. Having one thread go off and compute something and return the value when it is done is a lot easier to get right, and you can...
- ... use the task parallel library or another tool designed to manage inter-thread communication properly.
- If you cannot do that, try to mutate as few variables as possible. Do not set a variable to null. If you're filling in a list, initialize the variable once with a threadsafe list type, and from then on only read from that variable. Let the list object handle the threading concerns for you.
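That last suggestion might look something like this, using `ConcurrentQueue<T>` as the threadsafe collection (one reasonable choice among several in `System.Collections.Concurrent`):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Initialized exactly once, never reassigned, never null. Worker threads
// only ever read this reference; the collection does its own locking.
var items = new ConcurrentQueue<object>();

// Many threads hammering on the same collection, safely:
Parallel.For(0, 1000, _ => items.Enqueue(new object()));

Console.WriteLine(items.Count); // 1000
```

Because the variable is never null and never reassigned, the entire null-check-versus-race question from the original code simply disappears.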