People often say "writing a lock-free program is hard", "writing a correct lock-free program is even harder", and "when you do concurrent programming, you need to think in terms of transactions."
What exactly does "transaction" mean here?
I understand that while a program executes, the operating system scheduler may move it on and off the CPU many times, and the exact point in the code where it gets paused is nondeterministic. This is part of what makes concurrent programming hard: the execution flows of several threads may interleave in all kinds of ways.
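To make sure I am describing the problem correctly, here is a small sketch I wrote myself (the names and numbers are just for illustration, and the unsynchronized access is a data race, so technically undefined behavior in C++): two threads increment a shared counter, and because each increment is really a separate load, add, and store, updates are usually lost when the threads interleave.

#include <iostream>
#include <thread>

int counter = 0;                  // plain int: no atomicity, no synchronization
constexpr int ITERATIONS = 1'000'000;

void worker() {
    for (int i = 0; i < ITERATIONS; ++i) {
        counter = counter + 1;    // load, add, store: three separable steps
    }
}

int main() {
    std::thread t1(worker);
    std::thread t2(worker);
    t1.join();
    t2.join();
    // One might expect 2'000'000 if each increment were "transactional";
    // in practice the interleaved loads and stores lose updates.
    std::cout << counter << '\n';
}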
So does a transaction mean a single CPU instruction, or all the instructions executed within the same CPU entry (i.e., within one scheduling quantum)? For example, in the following code
x = x1 + x2 + x3;
is it possible that x1 + x2 is computed in one CPU entry, the addition of (the temporary holding the sum of x1 and x2) + x3 is computed in another CPU entry, and the assignment to x is done in a third CPU entry?
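In other words, is the statement allowed to be broken up into something like the following sequence of steps, with a preemption possible between any two of them? (This is only my guess at a possible decomposition; tmp1 and tmp2 are names I made up for compiler temporaries, which would really live in registers.)

// Hypothetical decomposition of `x = x1 + x2 + x3;` into separable steps.
int x, x1, x2, x3;

void assign() {
    int tmp1 = x1 + x2;    // step 1: could run in one CPU entry
    int tmp2 = tmp1 + x3;  // step 2: could run in another CPU entry
    x = tmp2;              // step 3: the store to x could happen in a third CPU entry
}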