I think it's silly to use parallelism when a simple change to the algorithm can speed it up a thousandfold. Assuming CategoryID has a decent implementation of GetHashCode(), a dictionary/lookup has nearly O(1) lookup time, compared to the O(n) time of scanning through a list.
Two possible approaches:
Turn categories into a dictionary
Assuming categories have unique IDs you can use:
var categoryById = categories.ToDictionary(c => c.CategoryID);
foreach (var product in products)
{
    var category = categoryById[product.CategoryID];
    category.Products.Add(product);
}
This has runtime O(products.Count + categories.Count) and uses O(categories.Count) memory.
If the categories don't already start out with an empty Products list, you'll need to create one first.
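For example, a minimal sketch of that initialization (assuming Products is a writable List&lt;Product&gt; property that may be null) could look like:

foreach (var category in categories)
{
    // Assumption: Products may be null until explicitly initialized
    if (category.Products == null)
        category.Products = new List<Product>();
}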
Turn products into a Lookup
var productsByCategory = products.ToLookup(product => product.CategoryID);
foreach (var category in categories)
{
    category.Products = productsByCategory[category.CategoryID].ToList();
}
This has runtime O(products.Count + categories.Count) and uses O(products.Count) memory.
Since there are typically more products than categories, this approach will require more memory. On the other hand, the lookup might remove the need to embed a list of products in the category object at all.
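As a rough sketch of that last idea (assuming the lookup can simply be passed around instead of a Products property), consumers can query it directly; a key with no products yields an empty sequence, so no null checks are needed:

var productsByCategory = products.ToLookup(product => product.CategoryID);

// Query the lookup wherever a category's products are needed,
// instead of storing them on the category object.
foreach (var category in categories)
{
    var productsInCategory = productsByCategory[category.CategoryID];
    Console.WriteLine($"{category.CategoryID}: {productsInCategory.Count()} products");
}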