
I am new to using the TPL in .NET applications. While creating a simple console application to run some dynamically created tasks in parallel, I am stuck on an issue.

The problem is that when 10 tasks are created and run, the console shows all 10 of them, but when they are written to a log file (after putting a delay between the console output and the logging), the log file randomly misses some of the items.

Below is my sample code (this is just a skeleton of my actual code):

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    public static int datacount = 10;

    static void Main(string[] args)
    {
        List<Task> tasks = new List<Task>();
        var s1 = DateTime.Now;
        var transList = GenerateTransactionList();

        foreach (var transaction in transList)
        {
            Transactions transactionNew = new Transactions();
            transactionNew = transaction;
            tasks.Add(Task.Factory.StartNew(() => serialMethod(transactionNew)));

        }
        Task.WhenAll(tasks).Wait();
        Console.WriteLine("Completed!!!");
    }

    private static List<Transactions> GenerateTransactionList()
    {
        Random r = new Random();
        List<Transactions> transactionList = new List<Transactions>();
        for (int i = 1; i <= datacount; ++i)
        {
            Transactions tr = new Transactions();
            tr.ID = 0;
            tr.Amount = r.Next(1, 10);
            tr.Created_By = "Iteration" + i;
            tr.Notes = "Iteration" + i;
            tr.Created_On = DateTime.Now;
            transactionList.Add(tr);
        }
        return transactionList;
    }
    private static async Task<string> serialMethod(Transactions tlist)
    {
        Console.WriteLine("Started Serial Iteration" + tlist.Notes);
        try
        {
            Console.WriteLine("Finished Serial Iteration" + tlist.Notes);
            Thread.Sleep(10000);//doing some time consuming process
            WriteLog("Parallel2", DateTime.Now, DateTime.Now, tlist.Notes);

            return "Success";
        }
        catch (Exception ex)
        {
            Console.WriteLine("serialmethod" + ex.Message);
            return "Failure";
        }
    }

    public static void WriteLog(string type,
        DateTime startTime, DateTime endTime,
        string dataSet)
    {
        try
        {
            string logFolderPath = AppDomain.CurrentDomain.BaseDirectory + @"\Logs";

            if (!Directory.Exists(logFolderPath))
                Directory.CreateDirectory(logFolderPath);

            string logFilePath = logFolderPath + @"\Log_" + DateTime.Today.ToString("yyyy.MM.dd") + ".csv";

            string line = string.Empty;
            if (!File.Exists(logFilePath))
            {
                line = @"""Type"",""Start Time"",""End Time"",""Duration"",""Iteration""";
                writeLineToFile(logFilePath, line);
            }

            string duration = (endTime - startTime).ToString();

            line = "\"" + type + "\"," +
                   "\"" + startTime.ToString("MM/dd/yyyy hh:mm:ss tt") + "\"," +
                   "\"" + endTime.ToString("MM/dd/yyyy hh:mm:ss tt") + "\"," +
                    "\"" + duration + "\"," +
                   "\"" + dataSet + "\"";

            writeLineToFile(logFilePath, line);

        }
        catch (Exception)
        {
            //do nothing
        }
    }

    private static void writeLineToFile(string fileName, string line)
    {

        using (var writer = new StreamWriter(fileName, true))
        {
            writer.WriteLine(line);
        }
    }
}

class Transactions
{
    public int ID { get; set; }
    public decimal Amount { get; set; }
    public int Points { get; set; }
    public string Notes { get; set; }
    public string Created_By { get; set; }
    public DateTime Created_On { get; set; }

}

Do you have any idea why this is happening? I have tried using a ConcurrentBag instead of a List, but that did not help either. Please guide me and let me know if I am missing anything or if my implementation is completely wrong.

  • Please see https://stackoverflow.com/questions/19304209/streamwriter-multi-threading-c-sharp – Peter Bons Mar 28 '18 at 13:15
  • Thanks for the info @PeterBons. As indicated by the line Thread.Sleep(10000);//doing some time consuming process in my sample code, I will have multiple database operations that I need to run in parallel. So is it possible to accomplish this using threads? If yes, how should I do it? – Ajib Mar 28 '18 at 15:39

1 Answer


There are a bunch of error-prone lines in your code:

  1. You're overwriting the reference for transaction in your foreach loop.
  2. You're using the StartNew method instead of Task.Run.
  3. You're blocking with Wait() instead of awaiting Task.WhenAll, so you block one thread in your application for no reason (see the loop sketch after this list).
  4. You could simply switch to Parallel.ForEach instead of foreach.
  5. Most importantly: you're writing to the same file from different threads simultaneously, so they basically interrupt each other. Either use some locking so that only one thread writes the file at a time (which then cannot be done in parallel), or use a library for logging, like NLog or similar, so it will handle the logging for you (see the logging sketch after this list).
  6. Your threads can run into a situation where some of them try to create the file while another has already done so, so move the file-creation logic into one place (which libraries like NLog will do for you properly).
  7. Try to use object initializers instead of setting one property after another:

    var tr = new Transactions
    {
        ID = 0,
        Amount = r.Next(1, 10),
        Created_By = "Iteration" + i,
        Notes = "Iteration" + i,
        Created_On = DateTime.Now
    };
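
For points 1 to 3, here is a minimal sketch of what the loop could look like. It assumes C# 7.1 or later so that Main can be declared async Task; the other names come from the question's code:

    static async Task Main(string[] args)
    {
        List<Task<string>> tasks = new List<Task<string>>();
        var transList = GenerateTransactionList();

        foreach (var transaction in transList)
        {
            // The foreach variable is scoped per iteration in modern C#,
            // so it can be captured directly; no extra "new Transactions()" copy is needed.
            tasks.Add(Task.Run(() => serialMethod(transaction)));
        }

        // await does not block the calling thread while the tasks run.
        await Task.WhenAll(tasks);
        Console.WriteLine("Completed!!!");
    }

Because serialMethod returns a Task<string>, Task.Run unwraps it, and the resulting tasks can be awaited together with Task.WhenAll.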
    
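For points 5 and 6, a minimal sketch of serializing the file access with a single lock object (the _logLock field is an addition that is not in the question's code; a logging library such as NLog would take care of this for you):

    // One lock object shared by every thread that writes the log.
    private static readonly object _logLock = new object();

    public static void WriteLog(string type, DateTime startTime, DateTime endTime, string dataSet)
    {
        string logFolderPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Logs");
        string logFilePath = Path.Combine(logFolderPath, "Log_" + DateTime.Today.ToString("yyyy.MM.dd") + ".csv");

        lock (_logLock)
        {
            // Creating the folder and the header inside the lock avoids two
            // threads racing to create the file at the same time (point 6).
            if (!Directory.Exists(logFolderPath))
                Directory.CreateDirectory(logFolderPath);

            if (!File.Exists(logFilePath))
                writeLineToFile(logFilePath, @"""Type"",""Start Time"",""End Time"",""Duration"",""Iteration""");

            string duration = (endTime - startTime).ToString();
            writeLineToFile(logFilePath,
                "\"" + type + "\",\"" + startTime.ToString("MM/dd/yyyy hh:mm:ss tt") + "\",\"" +
                endTime.ToString("MM/dd/yyyy hh:mm:ss tt") + "\",\"" + duration + "\",\"" + dataSet + "\"");
        }
    }

Only one thread at a time can be inside the lock block, so lines are no longer lost or interleaved when several tasks log simultaneously.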
VMAtm
  • Thank you for the detailed analysis. Actually, what I am trying to achieve is to do some parallel database operations (e.g. inserting records in parallel from a list, one by one, after doing some modifications to each object in the list). While doing this, many records are getting missed. I tried with both tasks and Parallel.ForEach. With Parallel.ForEach, the same records are being processed again from the list instead of distinct ones. Is there any solution to achieve that? – Ajib Mar 29 '18 at 05:59
  • In that case you definitely need a concurrent bag. Also, please add some information to your question; right now it is hard to say what's wrong in your code – VMAtm Mar 29 '18 at 14:32
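
For the Parallel.ForEach issue mentioned in the comments: Parallel.ForEach hands each element of the source list to exactly one iteration, so no item is processed twice as long as the body works only on its own parameter. A minimal sketch, where ProcessTransaction is a hypothetical stand-in for the per-record database work:

    // Sketch only: each Transactions instance is passed to the body exactly once.
    private static void RunInParallel(List<Transactions> transList)
    {
        Parallel.ForEach(transList, transaction =>
        {
            ProcessTransaction(transaction);
        });
    }

    private static void ProcessTransaction(Transactions transaction)
    {
        // Placeholder for modifying the object and inserting it into the database.
    }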