
I am currently implementing a web service. Consider the following pseudo code (based on ASP.NET Web API):

[HttpPost]
public IHttpActionResult UpdateUserAccount(UpdateUserAccountModel model)
{
    // 1. Check the model is valid
    //    a) does the user account have property x?

    // 2. Perform the update
    //    a) apply property x to the user account

    // 3. Save changes
}

Now I know it is an unlikely scenario, but I feel I should deal with it anyway. If two calls are made in parallel to this method with the same parameters, there is the possibility that the web service will be executing step (2a) when the second call comes in. The second call will also pass validation, as the initial call won't have saved its changes yet, and I will end up with unexpected results.
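The window described above is a classic check-then-act race. A minimal sketch of it (Python standing in for C# so it can run standalone; the account and its property are illustrative, not from the question):

```python
import threading
import time

account = {"credits": 0}

def update_user_account():
    # 1. Check the model is valid: "does the user account have property x?"
    if account["credits"] == 0:   # both requests pass this check...
        time.sleep(0.1)           # ...while the other is still mid-update
        # 2./3. Perform the update and save
        account["credits"] += 100

# Two "parallel calls with the same parameters":
t1 = threading.Thread(target=update_user_account)
t2 = threading.Thread(target=update_user_account)
t1.start(); t2.start()
t1.join(); t2.join()

print(account["credits"])  # 200, not the expected 100 — the update applied twice
```

Because neither call has "saved changes" when the other validates, both pass step 1 and the update is applied twice.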

One solution I have thought of is to implement a lock on the users account

[HttpPost]
public IHttpActionResult UpdateUserAccount(UpdateUserAccountModel model)
{
    // 1. Check the model is valid
    //    a) does the user account have property x?

    // 2. Check the account lock
    //    a) does the user's account have lock == true?
    //    b) if true, return an HTTP error response

    // 3. Apply the account lock
    //    a) set lock = true on the DB entry + save

    // 4. Perform the update
    //    a) apply property x to the user account

    // 5. Save changes

    // 6. Remove the account lock
    //    a) set lock = false on the DB entry + save
}

So now when the second call comes into the server, the lock will be set on the account and any issues around concurrent state will not surface.
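One caveat worth noting: checking the lock (step 2) and setting it (step 3) as separate reads and writes reintroduces the same race between those two steps. A common fix is to collapse check-and-set into a single conditional UPDATE, so acquisition is atomic. A sketch of that pattern (Python/sqlite3 standing in for C#/EF so it runs standalone; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, locked INTEGER, x INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 0, 0)")
conn.commit()

def try_acquire_lock(account_id):
    # Steps 2 and 3 collapsed into one atomic statement: the lock is only
    # taken if it is not already held, so two callers cannot both succeed.
    cur = conn.execute(
        "UPDATE accounts SET locked = 1 WHERE id = ? AND locked = 0",
        (account_id,))
    conn.commit()
    return cur.rowcount == 1  # True only for the caller that won the lock

print(try_acquire_lock(1))  # True  — first caller acquires the lock
print(try_acquire_lock(1))  # False — second caller gets rejected

# 6. Remove the account lock once the update has been saved
conn.execute("UPDATE accounts SET locked = 0 WHERE id = ?", (1,))
conn.commit()
```

The caller that gets `False` would map to the HTTP error response in step 2b.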

Is this the only way I can solve this issue? I'm hesitant to implement it, as I will need to do this for quite a few functions on the web service, it introduces other complications, and I'm concerned an account might end up permanently locked. That, combined with the unlikelihood of the API calls being timed so exactly that this becomes a problem - although I do want my service to be rock solid.

Patrick McCurley

2 Answers


I would personally look at using Entity Framework to commit your changes. EF is transactionally aware and will keep this kind of race condition from happening.

Here is another post that answers the question:

Transactions in the Entity Framework

  • Thanks for the link. I wasn't aware of AcceptAllChanges(); that certainly seems useful. However, in the example above, in 1a I would be using EF to check that the user account has property x. Even using transactions, there is a scenario where parallel calls would lead to property x being set twice. Unless I shared a single EF context across multiple server threads - but that's bad, right? – Patrick McCurley May 24 '14 at 02:58
  • Here is another article on managing concurrency with Entity Framework. It involves implementing optimistic locking: [Optimistic Concurrency with the Entity Framework](http://www.asp.net/mvc/tutorials/getting-started-with-ef-using-mvc/handling-concurrency-with-the-entity-framework-in-an-asp-net-mvc-application) – Robert Chumley May 24 '14 at 03:43
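At the database level, the optimistic locking described in that article reduces to a conditional update against a version (or rowversion/timestamp) column: the write only succeeds if the row is unchanged since it was read. A minimal sketch of that idea (Python/sqlite3 standing in for EF's concurrency tokens; names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, x INTEGER, version INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 0, 1)")
conn.commit()

def update_account(account_id, new_x, expected_version):
    # The write succeeds only if nobody changed the row since we read it;
    # the version bump invalidates any other in-flight update.
    cur = conn.execute(
        "UPDATE accounts SET x = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_x, account_id, expected_version))
    conn.commit()
    return cur.rowcount == 1

# Two "parallel" callers that both read the row at version 1:
print(update_account(1, 100, 1))  # True  — first writer wins
print(update_account(1, 100, 1))  # False — second writer sees a stale version
```

The caller that gets `False` would re-read the row and either retry or report a concurrency conflict, which is exactly what EF does when it raises a concurrency exception.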

For concurrency you should use a timestamp/datetime column: when a user submits changes, validate them against the last time the record was modified. Even if two users try to save at the same time, the second one to save will get an error telling them their information is stale.

kondas
  • For example:

        public string UpdateUserAccount(UserAccount userAccount)
        {
            // Retrieve the record from the database
            var serverUserAccount = GetUserAccountById(userAccount.Id);

            // ModifiedDateTime should be updated (on the server) before calling this function
            if (userAccount.ModifiedDateTime > serverUserAccount.ModifiedDateTime)
                return UpdateUserAccountToDb(userAccount); // send the information to the DB and return "success" or "error", etc.
            else
                return "The information from the server is newer than the information edited.";
        }

    – kondas May 24 '14 at 04:02