
I have two parts to my application: one does a massive number of inserts and the other a massive number of updates, and because of poor concurrency management they deadlock each other.

I am using Entity Framework for my inserts and updates.

The following is the code for my TestSpool program. Its purpose is to insert x records at a given interval.

using System;
using System.Linq;
using System.Threading;
using System.Transactions;
namespace TestSpool
{
    class Program
    {
        static void Main(string[] args)
        {
            using (var db = new TestEntities())
            {
                decimal start = 700001;
                while (true)
                {
                    using (TransactionScope scope = new TransactionScope())
                    {
                        //Random ir = new Random();
                        //int i = ir.Next(1, 50);
                        var objs = db.BidItems.Where(m => m.BidItem_Close == false);
                        foreach (BidItem bi in objs)
                        {
                            for (int j = 0; j <= 10; j++)
                            {
                                Transaction t = new Transaction();
                                t.Item_Id = bi.BidItemId;
                                t.User_Id = "Ghost";
                                t.Trans_Price = start;
                                t.Trans_TimeStamp = DateTime.Now;
                                start += 10;
                                db.Transactions.AddObject(t);

                            }
                            Console.WriteLine("Test Spooled for item " + bi.BidItemId.ToString() + " of " + 11 + " bids");
                            db.SaveChanges();
                        }

                        scope.Complete();
                        Thread.Sleep(5000);
                    }
                }
            }
        }
    }
}

The second part of the program is TestServerClass. The server class is supposed to process a huge number of transactions from TestSpool, determine the highest transaction amount, and write the result to another table.

using System;
using System.Linq;
using System.Transactions;
public class TestServerClass
{

    public void Start()
    {
        try
        {

            using (var db = new TestServer.TestEntities())
            {

                while (true)
                {
                    using (TransactionScope scope = new TransactionScope())
                    {
                        var objsItems = db.BidItems.Where(m => m.BidItem_Close == false);
                        foreach (TestServer.BidItem bi in objsItems)
                        {
                            var trans = db.Transactions.Where(m => m.Trans_Proceesed == null && m.Item_Id == bi.BidItemId).OrderBy(m => m.Trans_TimeStamp).Take(100);

                            if (trans.Count() > 0)
                            {
                                var tran = trans.OrderByDescending(m => m.Trans_Price).FirstOrDefault();

                                // TestServer.BidItem bi = db.BidItems.FirstOrDefault(m => m.BidItemId == itemid);
                                if (bi != null)
                                {
                                    bi.BidMinBid_LastBid_TimeStamp = tran.Trans_TimeStamp;
                                    bi.BidMinBid_LastBidAmount = tran.Trans_Price;
                                    bi.BidMinBid_LastBidBy = tran.User_Id;

                                }
                                foreach (var t in trans)
                                {
                                    t.Trans_Proceesed = "1";
                                    db.Transactions.ApplyCurrentValues(t);
                                }

                                db.BidItems.ApplyCurrentValues(bi);
                                Console.WriteLine("Processed " + trans.Count() + " bids for Item " + bi.BidItemId);
                                db.SaveChanges();

                            }


                        }
                        scope.Complete();

                    }

                }

            }


        }
        catch (Exception e)
        {
            Start();
        }
    }

}

However, when both applications run concurrently, they deadlock fairly quickly, at random, in either the test or the server application. How do I optimise my code on both sides to prevent deadlocks? I am expecting a huge number of inserts from the TestSpool application.
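One likely contributor worth noting: when constructed with no options, `TransactionScope` defaults to `Serializable` isolation, which takes range locks that readers and writers in the two applications can then deadlock on. A common mitigation is to pass explicit `TransactionOptions` with a less aggressive isolation level such as `Snapshot` (which on SQL Server requires `ALLOW_SNAPSHOT_ISOLATION` to be enabled on the database). A minimal sketch, with the EF work elided:

```csharp
using System;
using System.Transactions;

class SnapshotScopeDemo
{
    // Build options for a TransactionScope that uses Snapshot isolation
    // instead of the default (Serializable), whose range locks are a
    // common deadlock source when readers and writers overlap.
    static TransactionOptions MakeOptions()
    {
        return new TransactionOptions
        {
            IsolationLevel = IsolationLevel.Snapshot,
            Timeout = TimeSpan.FromSeconds(30)
        };
    }

    static void Main()
    {
        var options = MakeOptions();
        using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
        {
            // ... query BidItems, add Transactions, db.SaveChanges() ...
            scope.Complete();
        }
        Console.WriteLine(options.IsolationLevel); // prints "Snapshot"
    }
}
```

This only changes the locking behaviour; it does not remove the underlying contention between the two loops, so it is a complement to (not a substitute for) restructuring the work.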

Melvin
  • To start, can you remove the catch block from the server code for now and see what happens? As of now, you essentially have a recursive method with no base case for your server code.
    Also, in your client code, put the sleep after the transaction scope is finished (after the end of the using statement) and see how that affects the output?
    – Ameen Dec 26 '12 at 07:44
  • Without the try/catch on the server, both applications hit errors and die. – Melvin Dec 26 '12 at 07:48

1 Answer


Since they work on the same data and get in each other's way, I believe the cleanest solution is to avoid executing the two at the same time.

Define a global static variable, a mutex, or a flag of some kind, perhaps in the database. Whichever process starts executing raises the flag; the other waits for the flag to come down, then raises it itself and starts executing.

To avoid long wait times, alter both classes so that each turn processes only a limited number of records. You should also introduce a maximum wait time for the flag, and choose the record limit carefully so that each class finishes its job in less time than the maximum wait.
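The flag described above could be implemented as a named (cross-process) `Mutex` with a bounded wait. This is only a sketch; the mutex name and batch delegate are made up for illustration:

```csharp
using System;
using System.Threading;

class FlagDemo
{
    // Cross-process "flag": whichever process acquires the named mutex runs
    // its batch; the other blocks, up to maxWait, until the mutex is released.
    static void RunBatch(string flagName, Action batch, TimeSpan maxWait)
    {
        using (var flag = new Mutex(false, flagName))
        {
            if (!flag.WaitOne(maxWait))
            {
                // Bounded wait: give up and retry on the next turn
                // rather than blocking forever.
                Console.WriteLine("timed out waiting for flag");
                return;
            }
            try { batch(); }
            finally { flag.ReleaseMutex(); }
        }
    }

    static void Main()
    {
        // "BidProcessingFlag" is a hypothetical name; both processes
        // must agree on it for the coordination to work.
        RunBatch("BidProcessingFlag",
                 () => Console.WriteLine("batch ran"),
                 TimeSpan.FromSeconds(10));
    }
}
```

Each side would wrap its per-turn work (the inserts in TestSpool, the bulk processing in TestServerClass) in `RunBatch`, keeping the batch small enough to finish well within the other side's `maxWait`.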

e-mre
  • TestSpool simulates a random number of inserts from a web interface in a given time frame, so it's unpredictable who goes first; whoever bids on the item first is supposed to win. – Melvin Dec 26 '12 at 08:48
  • TestSpool simulates random inserts from web interface and TestServerClass simulates bulk processing of records on the server level. Am I right? So you should raise the flag before you start bulk processing on the server. Bidding inserts will wait in a loop for the flag and continue when the flag goes down. This will avoid deadlocks. Waiting should happen before you even open a transaction and you should timestamp bid records before you put them in wait so you don't lose accurate time information. – e-mre Dec 26 '12 at 08:59
  • hi, if I used the flag method, do you think it's feasible that all the bids are inserted into a queue in a static class.. and with a while loop, if the flag is up, the server will process the bids ? – Melvin Dec 27 '12 at 00:54
  • 1
    Presuming that the exact time a bid is placed by the user is important for your business case, you should timestamp your bids before you put them in a queue of any kind. After that you can put the bids in a static queue, put them in a separate db table for later processing, or simply make all bids wait in a "while loop" in their threads until the flag is down. If you choose the static queue or temp table approach you can control the order bids are processed in after the flag is down; if you choose the "while loop" approach bids might come in any order after the flag is down, but it would ... – e-mre Dec 27 '12 at 06:48
  • ... probably not matter since you timestamp them before putting in wait loop. Either case flag must be carefully implemented with a mutex, semaphore or lock of some kind to ensure thread safety. – e-mre Dec 27 '12 at 06:52
  • This is how I do it: I set a flag on insert and use a static queue object to hold all the transactions. When the flag is up, 10 records from the queue are processed and the flag is handed to the server; once the server has processed the transactions, it sets the flag back for the inserts. So far it is going well, since the queue object is FIFO. – Melvin Dec 28 '12 at 06:01
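The queue-plus-flag approach settled on in the comments could be sketched with a `ConcurrentQueue`: bids are timestamped at enqueue time (so ordering information survives the wait), and the server drains a bounded FIFO batch per turn instead of competing with the inserters row by row. The `Bid` type and method names here are hypothetical, not from the original code:

```csharp
using System;
using System.Collections.Concurrent;

class BidQueueDemo
{
    // Hypothetical in-memory bid record, timestamped when enqueued.
    class Bid
    {
        public string UserId;
        public decimal Price;
        public DateTime TimeStamp;
    }

    static readonly ConcurrentQueue<Bid> Pending = new ConcurrentQueue<Bid>();

    // Web-facing side: timestamp the bid immediately, then enqueue it.
    static void Enqueue(string user, decimal price)
    {
        Pending.Enqueue(new Bid { UserId = user, Price = price, TimeStamp = DateTime.Now });
    }

    // Server side: drain up to batchSize bids in FIFO order. The real
    // server would persist these in one transaction (find the highest
    // price, update BidItems, mark the rows processed).
    static int ProcessBatch(int batchSize)
    {
        int processed = 0;
        Bid bid;
        while (processed < batchSize && Pending.TryDequeue(out bid))
        {
            processed++;
        }
        return processed;
    }

    static void Main()
    {
        for (int i = 0; i < 25; i++) Enqueue("user" + i, 100m + i);
        Console.WriteLine(ProcessBatch(10)); // 10
        Console.WriteLine(ProcessBatch(10)); // 10
        Console.WriteLine(ProcessBatch(10)); // 5
    }
}
```

Because only the server thread touches the database tables, the reader/writer contention that caused the deadlocks disappears; the trade-off is that queued bids are lost if the process dies before they are persisted, which is why the comments also mention a temp-table variant.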