
I am using C# and I have an enumerator, and I am reading the data from the enumerator sequentially.

The enumerator is a third-party library object and does not support `Parallel.ForEach`.

while(enumerator.Next())
{
  var item = enumerator.Read();
  ProcessItem(item);
}

void ProcessItem(Item item)
{
  // Is a lock required here?
  if(item.prop == "somevalue")
    this._list.Add(item);
}

I want to achieve multithreading here while reading the content.

while(enumerator.Next())
{
  // This code should run in a multi-threaded way

  var item = enumerator.Read();
  // ProcessItem method puts these items on a class level list property 
  // Is there any Lock required?
  ProcessItem(item);
}

I am new to multithreading. Please share any code samples that satisfy the above requirement.

Ankur Arora
  • Refer to this: https://www.codeproject.com/Articles/56575/Thread-safe-enumeration-in-C – Shiwanka Chathuranga Feb 02 '18 at 08:29
  • do you mean `ProcessItem(item)` does something like `_list.Add(item)`? depending on the `_list` type it requires locking. if you expand some details it will help (show the code for ProcessItem) – Ashkan Nourzadeh Feb 02 '18 at 08:33
  • Yes, based on some condition, it adds some of the items in the list. For eg: If(item.prop == "something") then _list.Add(item); – Ankur Arora Feb 02 '18 at 08:36
  • The example from CodeProject is probably not what you need. I would not recommend using it, not only because it breaches the Gendarme `DoNotUseLockedRegionOutsideMethodRule` rule *(and not only this rule)*. As far as I understand this, I would use a custom implementation of `IEnumerator` with a lock for getting the objects. But it also depends on the real implementation of `ProcessItem`, as stated by Ashkan Nourzadeh. With the request to change the list within its enumeration the code can get really complex, and I would say you should look for another way that does not change the list. – Julo Feb 02 '18 at 08:37
  • Possible duplicate of [When to use a Parallel.ForEach loop instead of a regular foreach?](https://stackoverflow.com/questions/12251874/when-to-use-a-parallel-foreach-loop-instead-of-a-regular-foreach) – Cleptus Feb 02 '18 at 08:38
  • I wonder if .NET built-in functions for parallel loops don't fit your needs. Check Parallel.ForEach (shipped in .NET 4.0) – Cleptus Feb 02 '18 at 08:40
  • This is a third party object and does not support Parallel.Foreach – Ankur Arora Feb 02 '18 at 08:46
  • Somebody downvoted the question and gave some links for reference, but they do not satisfy the requirement – Ankur Arora Feb 02 '18 at 08:52
  • @AnkurArora check the edited [answer](https://stackoverflow.com/a/48578839/2669438) – Ashkan Nourzadeh Feb 02 '18 at 09:23
  • What do you mean by `does not support Parallel.ForEach`? `Parallel.ForEach` can accept *any* `IEnumerable`. This object does not have to be thread-safe and most of the time it is not. – usr Feb 02 '18 at 10:34
  • As it is a third-party library, I can only see the metadata, and I noticed that they haven't implemented the IEnumerable interface. I like the idea of wrapping that in our own custom enumerator and then using TPL – Ankur Arora Feb 02 '18 at 11:18

2 Answers


This is a good fit for task-based parallelization: the processing of each item corresponds to one task. Hence, you can change the loop to the following:

  var tasks = new List<Task<int>>();
  while(enumerator.Next())
  {
    var item = enumerator.Read();

    // One task per item; the lambda captures its own copy of item.
    Task<int> task = new Task<int>(() => ProcessItem(item));
    task.Start();
    tasks.Add(task);
  }


  foreach(Task<int> task in tasks)
  {
    // Result blocks until the task has finished, so only this thread
    // ever writes to classList.
    int i = task.Result;
    classList.Add(i);
  }

Note that no explicit lock on classList is required: all tasks are spawned in the while loop, and the results are merged in the foreach loop on a single thread. The synchronization comes specifically from accessing Result, which blocks until the corresponding task has finished. This pattern assumes that ProcessItem is changed to return its result (an int here) instead of adding to a shared list itself.
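
For completeness, here is a minimal sketch of what ProcessItem could look like in this task-based variant; the int return value and the filtering condition are assumptions for illustration, not part of the original question:

  // Hypothetical ProcessItem for the pattern above: it returns a result
  // instead of adding to a shared list, so the tasks never touch shared
  // state and no lock is needed.
  static int ProcessItem(Item item)
  {
      // Placeholder condition and result; replace with the real per-item work.
      return item.prop == "somevalue" ? 1 : 0;
  }

classList is then, for example, a List<int> that only the main thread ever writes to.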

Kristof U.

Yes, some locking is required. You can achieve it either with `lock` or by using a concurrent collection type.

Using `lock`:

void ProcessItem(Item item)
{
    if(item.prop == "somevalue")
    {
        // _list is shared between threads, so every Add must be guarded
        // by the same lock object.
        lock(_list)
        {
            _list.Add(item);
        }
    }
}
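
Alternatively, a concurrent collection avoids the explicit lock. This is only a sketch, assuming you replace `_list` with a `ConcurrentBag<Item>` (any thread-safe collection from `System.Collections.Concurrent` would work):

// Requires: using System.Collections.Concurrent;
// ConcurrentBag<T>.Add is thread-safe, so no explicit lock is needed here.
private ConcurrentBag<Item> _bag = new ConcurrentBag<Item>();

void ProcessItem(Item item)
{
    if(item.prop == "somevalue")
        _bag.Add(item);
}

Note that ConcurrentBag does not preserve insertion order; if ordering matters, stick with the lock around a List<Item>.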

Edit: Based on the details you provided, you can wrap the enumerator from the external lib in your own enumerator, as below, so that you can use Parallel.ForEach on it:

We assume the enumerator you got is something like MockEnumerator. We wrap it in a normal IEnumerator and IEnumerable so that we are able to use Parallel.ForEach to read from it in parallel.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Threading.Tasks;

class Program
{
    class Item
    {
        public int SomeProperty { get; }

        public Item(int prop)
        {
            SomeProperty = prop;
        }
    }

    // Stand-in for the third-party enumerator: Next() advances to the next
    // item and Read() returns the item Next() just moved to.
    class MockEnumerator
    {
        private Item[] _items = new Item[] { new Item(1), new Item(2) };
        private int _position = 0;

        public bool Next()
        {
            return _position++ < _items.Length;
        }

        public Item Read()
        {
            // Next() has already advanced _position, so the current item
            // is at _position - 1.
            return _items[_position - 1];
        }
    }

    // Wraps the third-party enumerator so it can be consumed as a regular
    // IEnumerable<Item>/IEnumerator<Item>, e.g. by Parallel.ForEach.
    class EnumeratorWrapper : IEnumerator<Item>, IEnumerable<Item>
    {
        private readonly MockEnumerator _enumerator;

        public EnumeratorWrapper(MockEnumerator enumerator)
        {
            this._enumerator = enumerator;
        }

        public Item Current => _enumerator.Read();

        object IEnumerator.Current => Current;

        public void Dispose()
        {
        }

        public IEnumerator<Item> GetEnumerator()
        {
            // Parallel.ForEach calls this overload, so it must return a
            // usable enumerator rather than throwing.
            return this;
        }

        public bool MoveNext()
        {
            return _enumerator.Next();
        }

        public void Reset()
        {
            // The third-party enumerator cannot be rewound; Parallel.ForEach
            // never calls Reset, so this is left as a no-op.
        }

        IEnumerator IEnumerable.GetEnumerator()
        {
            return this;
        }
    }

    private static List<Item> _list = new List<Item>();

    static void Main(string[] args)
    {
        var enumerator = new EnumeratorWrapper(new MockEnumerator());
        Parallel.ForEach(enumerator, item =>
        {
            if (item.SomeProperty == 1) // some value
            {
                // _list is shared across the parallel iterations, so the Add
                // must still be protected by a lock.
                lock (_list)
                {
                    _list.Add(item);
                }
            }
        });
    }
}
Ashkan Nourzadeh