So there's a lot of talk about concurrency and multi-threading lately, and for good reason, but I'm having trouble seeing practical applications.
Like, .NET recently added ConcurrentDictionary, ConcurrentBag, ConcurrentStack, ConcurrentQueue, etc.
What are some concrete, practical examples of when these would come into play? Ideally I'd like some easily relatable scenario where either the non-concurrent collection would fail or the concurrent one would be much faster because it is easy to parallelize.
I'm primarily interested in ConcurrentDictionary, if that makes providing examples easier. When I think about it, I don't understand how it being concurrent can help with speed.
Let's say you had a data source you wished to add to it. Wouldn't iterating over the data source across 8 threads on 8 cores be about the same speed as doing it on one thread, since in either case you're starting from one end of the source and going to the other, and with the single-threaded version there's no contention and no risk of two threads trying to add data that has already been added?
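To make that concrete, here is a rough sketch of what I'm imagining (LoadRecords is just a made-up placeholder for whatever the data source might be, not a real API): a single-threaded fill of a plain Dictionary versus a Parallel.ForEach fill of a ConcurrentDictionary.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class Example
{
    // Stand-in for reading from a file, database, web service, etc.
    static IEnumerable<KeyValuePair<int, string>> LoadRecords()
    {
        return Enumerable.Range(0, 1000000)
                         .Select(i => new KeyValuePair<int, string>(i, "value " + i));
    }

    static void Main()
    {
        // Single-threaded: one pass over the source into a plain Dictionary.
        var plain = new Dictionary<int, string>();
        foreach (var kvp in LoadRecords())
            plain.Add(kvp.Key, kvp.Value);

        // Parallel: multiple cores all adding into a ConcurrentDictionary at once.
        // This is the version I'd expect people to say is faster, but since both
        // versions still have to walk the source from end to end, I don't see
        // where the win comes from.
        var concurrent = new ConcurrentDictionary<int, string>();
        Parallel.ForEach(LoadRecords(), kvp => concurrent.TryAdd(kvp.Key, kvp.Value));

        Console.WriteLine(plain.Count + " vs " + concurrent.Count);
    }
}
```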
Basically, I want to understand how concurrency can make programs faster and in what situations the non-concurrent implementation would fail, because I currently can't seem to think of many examples where they would be useful in this regard. (Primary language C#)