
I have code in my ASP.NET site that must not be run by two threads or processes concurrently, so I put it in a lock. This works assuming there is only one process, and since 'Maximum Worker Processes' is set to 1 in IIS, I felt that this was reasonable. However, I did an experiment that makes me wonder:

I created this action:

public void Test()
{
    for (int i = 0; i < 100; i++)
    {
        System.IO.File.AppendAllText(@"c:\tmp\d.txt", $"a: {i}\n");
        System.Threading.Thread.Sleep(1000);
    }
}

and called it from my browser. I then switched `a:` to `b:`, compiled it, and called it from another browser tab. Then I opened d.txt, and I saw something like this:

b: 31
a: 67
b: 32
a: 68
b: 33
a: 69

Clearly there are 2 processes running at the same time, and my lock will not be enough. What is the best method of ensuring that my piece of code is not run concurrently?

wezten
  • Open the file with a write lock on it, write to the open stream, and when done dispose of the stream which closes the file (just wrap the whole thing in a using block). What to do with the other thread that encounters the initial exception if the file is already open is not clear, that is up to your logic. You can't use a lock or even a mutex, these are not thread safe in a web farm architecture so if you ever want your site load balanced across multiple web servers that approach (lock or mutex) would not work. – Igor Mar 09 '17 at 13:32
  • @Igor my code does not open any files. This is just a test action to prove that 2 processes can run concurrently. – wezten Mar 09 '17 at 13:34
  • You say you put a lock in it; where is the lock in your test? – Dave Becker Mar 09 '17 at 13:34
  • 1
    `my code does not open any files` <= what do you think `System.IO.File.AppendAllText` does? – Igor Mar 09 '17 at 13:35
  • @Igor he means his "actual" code; writing files is just his test – Dave Becker Mar 09 '17 at 13:35
  • @DaveBecker The test simply proves that multiple processes exist at the same time. Obviously once there are multiple processes, a lock is useless, since you're locking on different objects. – wezten Mar 09 '17 at 13:35
  • "and my lock will not be enough" Where is your lock ? – Aristos Mar 09 '17 at 13:36
  • If this is just throw away code then you should better explain **why you want to synchronize your code and on what.** The whole point of the web architecture is that you **want** the server to handle multiple simultaneous requests. **Depending on the resource in contention there are different strategies to ensure synchronization**. So without that information there is no way to help you. – Igor Mar 09 '17 at 13:36
  • @Igor This is a test action! My real code does not open any files. – wezten Mar 09 '17 at 13:36
  • have a `static` lock then. – Dave Becker Mar 09 '17 at 13:36
  • What should happen to the second request? Should it be refused, or queued? In general, I'd look to move any logic like this *out* from the website. Get it running in some other context (e.g. windows service, etc) and then implement an appropriate model for servicing the requests. – Damien_The_Unbeliever Mar 09 '17 at 13:37
  • @DaveBecker static doesn't help between processes – wezten Mar 09 '17 at 13:37
  • no, it doesn't help in a web farm either come to think of it. – Dave Becker Mar 09 '17 at 13:38
  • @Igor My code reads a feed and populates a DB table from it. I clear the table first, read the feed, and insert the rows. – wezten Mar 09 '17 at 13:38
  • 1
    @wezten OK. If you actually want help explain what you're doing and give us some _actual_ sample code – ProgrammingLlama Mar 09 '17 at 13:39
  • Have you considered creating a WCF service that would use something like `InstanceContextMode.Single` and `ConcurrencyMode.Single`? – Dan Field Mar 09 '17 at 13:39
  • @wezten What database? SQL? MySql? Mongo? other? How is _anyone_ supposed to offer you help without any information? – ProgrammingLlama Mar 09 '17 at 13:39
  • @john sql server – wezten Mar 09 '17 at 13:39
  • `My code reads a feed and populates a DB table` <= there are many ways to synchronize DB write access. DB Transactions are one way, using row versions is another. Again, this requires more information like what RDBMS you are using, how you are doing this (stored procs, standard t-sql statements, using ado.net, etc), and what do you want to happen with the caller that comes in when the resource is currently in use. – Igor Mar 09 '17 at 13:40
  • http://stackoverflow.com/questions/3662766/sql-server-how-to-lock-a-table-until-a-stored-procedure-finishes is one result a quick Google search turns up. – ProgrammingLlama Mar 09 '17 at 13:41
  • @john There are multiple tables involved, and I do not want to lock them. They may be used by other actions at the same time. – wezten Mar 09 '17 at 13:43
  • Again, there are multiple ways to ensure database concurrency. Locking is one way but you can lock at various levels to ensure that read access is not locked or that readers grab the uncommited data even though there is a writer updating that same table. There is no way to know what is correct though without knowing the RDBMS (including version), how you are accessing the database, and the rules for callers that want to execute an update that come in while there is an update in progress. – Igor Mar 09 '17 at 13:45
  • @john I just want to stop the issue of multiple processes, without all the complications of locking and transactions. Hence the title of my question 'asp.net prevent multiple processes'. I am not asking for alternative advice based on my actual scenario. – wezten Mar 09 '17 at 13:45
  • @Igor my question is about how to prevent my code running concurrently, not about alternative solutions. – wezten Mar 09 '17 at 13:47
  • In short create a static lock object and use the lock statement. If there are multiple entry points on the same server use a named mutex. That is the fastest thing you can do but it does not scale at all, you will always be limited to 1 server which is why concurrency is not always something you fix with a couple lines of code. – Igor Mar 09 '17 at 13:48
  • @Igor static lock objects will not work for multiple processes, since each process has a different object instance. – wezten Mar 09 '17 at 13:49
  • @wezten Your test mechanism involves multiple servers. I'm assuming this is because you'll have multiple servers in production. Even semaphores, mutexes, etc. won't help you there. Dan Field's #1 answer is probably the way to go. – ProgrammingLlama Mar 09 '17 at 13:50
  • 1
    Did you not read the 2nd sentence? Read the whole comment before you respond please. `... If there are multiple entry points on the same server use a named mutex.` – Igor Mar 09 '17 at 13:50
  • @john why do you say it involves multiple servers? – wezten Mar 09 '17 at 13:52
  • @Igor I believe you edited the comment later. – wezten Mar 09 '17 at 13:52
  • @wezten "The test simply proves that multiple processes exist at the same time." <-- Why would you launch multiple copies of the same web service on one machine? – ProgrammingLlama Mar 09 '17 at 13:58
  • @john let's say I make a change and deploy it - you can see from my experiment that the old dll may still be running together with the new one. – wezten Mar 09 '17 at 13:59
  • @wezten Your test was performed by launching two separate IIS processes on the same system, correct? – ProgrammingLlama Mar 09 '17 at 14:03
  • @john I wrote exactly how I did my test. It is really simple - try it yourself. Make the action, call it, change it a bit, press F6 (to build), call it again. – wezten Mar 09 '17 at 14:04
  • @wezten OK, that was unclear from your initial description. A) that's an unlikely scenario in a production environment. B) You're making a service, which by definition, regardless of multiple builds, can be called from multiple places at once. C) I was simply trying to warn you that, since it looked like you might be intending to release to a multi-server environment, a mutex won't help there. – ProgrammingLlama Mar 09 '17 at 14:09

3 Answers


It sounds like you're really looking for a queue or queue-like singleton service. Regular ASP.NET Web API doesn't make that easy to do out of the box. You could fiddle around with making it work, but instead I'd consider one of the following:

  1. Use an actual queue (MSMQ, Azure Service Bus Queue, Rabbit MQ, etc.) that your endpoints connect to, with a listener on that queue that does the actual work. That may not really give you the synchronous communication you want, but it would be sure to keep things ordered and single.
  2. Create a WCF service and decorate it with InstanceContextMode = InstanceContextMode.Single and ConcurrencyMode = ConcurrencyMode.Single. The WCF Framework will take care of making that work as a singleton for you. You could host it in a Windows Service instead of IIS, which would further ensure that multiple processes didn't spin up, and you'd have full control over the threading model.
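
Option 2 can be sketched roughly like this (a minimal illustration; the service and method names are made up for the feed-import scenario from the comments):

    using System.ServiceModel;

    [ServiceContract]
    public interface IFeedImportService
    {
        [OperationContract]
        void ImportFeed();
    }

    // InstanceContextMode.Single: WCF creates exactly one service instance.
    // ConcurrencyMode.Single: only one call executes at a time; the rest queue.
    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                     ConcurrencyMode = ConcurrencyMode.Single)]
    public class FeedImportService : IFeedImportService
    {
        public void ImportFeed()
        {
            // clear the table, read the feed, insert the rows -
            // guaranteed by WCF not to run concurrently with itself
        }
    }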
Dan Field

For the case you have here, one way is to use a named `Mutex`:

    // The key can be part of the file name -
    //   be careful: not all characters are valid in a mutex name,
    //   and a @"Global\" prefix makes it visible across all sessions.
    // Note: initiallyOwned must be false here, otherwise the creating
    //   thread already owns the mutex and the WaitOne below would
    //   acquire it a second time, so one ReleaseMutex would not free it.
    using (var mut = new Mutex(false, key))
    {
        try
        {
            // Wait until it is safe to enter.
            mut.WaitOne();

            // now call your file manipulation function
            Test();
        }
        finally
        {
            // Release the Mutex.
            mut.ReleaseMutex();
        }
    }
Aristos
  • Will the mutex be released even if IIS is restarted and the threads aborted? – wezten Mar 09 '17 at 13:51
  • @wezten when you restart IIS, or change a file so the project needs to recompile, there is a timeout you set on the application pool during which IIS waits for your application to finish its work - otherwise it kills it. If you need this kind of synchronization, you can use global.asax to trigger and stop long-running work that is inside the mutex - in any case the mutex will be released. – Aristos Mar 09 '17 at 13:58
  • @wezten also note that by default the session uses a mutex and locks your calls anyway - look at this answer: http://stackoverflow.com/questions/11629600/does-asp-net-web-forms-prevent-a-double-click-submission/11629664#11629664 – Aristos Mar 09 '17 at 14:00

The issue you described could be caused by either threads or processes.

Assuming you just want to solve this problem on a single machine, and you want to prevent multiple worker processes (as the title implies), then you might just lock a file for the life of the worker process.

Eg. in Application_Start you can get the temp directory (System.IO.Path.GetTempPath), and then prepare a file name using a common identifier like the site name from IApplicationHost. Then lock the file with an exclusive lock (or fail fast if it is already locked). I recommend logging something before killing your worker process.

If/when your worker process dies, the file is unlocked automatically. You can even get it to delete itself by opening it with CreateFile and FILE_FLAG_DELETE_ON_CLOSE (FileOptions.DeleteOnClose in .NET).

Of course if your application might be hosted on multiple servers then you probably won't want to use this type of approach since it won't be sufficient.
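
A rough sketch of the above in Global.asax (the lock-file name "MySite.lock" is illustrative; in practice derive it from the site name as described):

    using System;
    using System.IO;

    public class Global : System.Web.HttpApplication
    {
        // Held for the life of the worker process; deliberately never disposed.
        private static FileStream _processLock;

        protected void Application_Start(object sender, EventArgs e)
        {
            string path = Path.Combine(Path.GetTempPath(), "MySite.lock");

            // FileShare.None gives an exclusive lock, so a second worker
            // process gets an IOException here. DeleteOnClose removes the
            // file once the process dies and the OS closes the handle.
            _processLock = new FileStream(path, FileMode.OpenOrCreate,
                FileAccess.ReadWrite, FileShare.None, 4096,
                FileOptions.DeleteOnClose);
        }
    }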

David Beavon