.NET 6 update: a PriorityQueue<TElement, TPriority> class was introduced with .NET 6, which makes the implementation below mostly irrelevant.
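For reference, here is a minimal sketch of how the built-in class is used (the element names and prices are made up for illustration):

var builtIn = new PriorityQueue<string, decimal>();
builtIn.Enqueue("bananas", 1.75m);
builtIn.Enqueue("apples", 2.50m);
if (builtIn.TryDequeue(out string element, out decimal priority))
    Console.WriteLine($"{element} ({priority})"); // bananas (1.75)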
Here is a basic PriorityQueue class that you could use. It is based on a SortedSet<(TSource, TPriority, long)>, with the long serving to differentiate items that have equal priority. This is needed because a SortedSet<T> discards duplicates, so without a unique index two items comparing as equal could not coexist in the set.
public class PriorityQueue<TSource, TPriority>
{
    private readonly SortedSet<(TSource, TPriority, long)> _set;
    private long _index; // monotonically increasing tie-breaker for equal priorities

    public PriorityQueue(IComparer<TPriority> priorityComparer = null)
    {
        priorityComparer ??= Comparer<TPriority>.Default;
        var comparer = Comparer<(TSource, TPriority, long)>.Create((x, y) =>
        {
            // Compare by priority first; fall back to the insertion index,
            // so that equal-priority items are dequeued in FIFO order.
            int result = priorityComparer.Compare(x.Item2, y.Item2);
            if (result == 0) result = x.Item3.CompareTo(y.Item3);
            return result;
        });
        _set = new SortedSet<(TSource, TPriority, long)>(comparer);
    }

    public void Enqueue(TSource item, TPriority priority)
    {
        _set.Add((item, priority, _index++));
    }

    public bool TryDequeue(out TSource item)
    {
        if (_set.Count == 0) { item = default; return false; }
        var min = _set.Min; // the entry with the smallest priority
        _set.Remove(min);
        item = min.Item1;
        return true;
    }
}
The priority in your case is the cost of each item: smaller TPriority values denote higher priority, so items with lower cost are dequeued first.
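If you need the opposite ordering (the most expensive item dequeued first), the optional priorityComparer constructor parameter can reverse the comparison. A minimal sketch, reusing the hypothetical Item type from the usage example below:

var maxQueue = new PriorityQueue<Item, decimal>(
    Comparer<decimal>.Create((x, y) => y.CompareTo(x))); // reversed comparison

Note that ties are still resolved in FIFO order, because the index comparison inside the class is not affected by the custom comparer.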
Usage example:
var queue = new PriorityQueue<Item, decimal>();
//...
queue.Enqueue(item1, item1.Price);
queue.Enqueue(item2, item2.Price);
//...
if (queue.TryDequeue(out var bestPriceItem))
    Console.WriteLine(bestPriceItem);
else
    Console.WriteLine("The warehouse is empty.");
The implementation above should be quite efficient with large numbers of items. The SortedSet<T> is implemented internally as a self-balancing binary search tree (a red-black tree), so the basic operations have a decent O(log n) complexity. But it's not the optimal solution. If you want the best of the best, you could take a look at what became the official Microsoft implementation in .NET 6, which at the time of writing was in the "api-approved" stage (source code). That one is based on a heap instead of a binary search tree, with the same O(log n) complexity but a smaller memory footprint and lower constant overhead per operation.
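To give an idea of the heap approach, below is a rough sketch of an array-backed binary min-heap with the same Enqueue/TryDequeue surface. It is illustrative only, not Microsoft's actual implementation, and it omits the FIFO tie-breaking of the class above:

public class MinHeap<TSource, TPriority>
{
    // The heap is stored in a flat list; the children of index i
    // live at 2i+1 and 2i+2, and the smallest priority sits at index 0.
    private readonly List<(TSource Item, TPriority Priority)> _heap = new();
    private readonly IComparer<TPriority> _comparer;

    public MinHeap(IComparer<TPriority> comparer = null)
        => _comparer = comparer ?? Comparer<TPriority>.Default;

    public void Enqueue(TSource item, TPriority priority)
    {
        _heap.Add((item, priority));
        int i = _heap.Count - 1;
        while (i > 0) // sift the new entry up until the heap property holds
        {
            int parent = (i - 1) / 2;
            if (_comparer.Compare(_heap[i].Priority, _heap[parent].Priority) >= 0) break;
            (_heap[i], _heap[parent]) = (_heap[parent], _heap[i]);
            i = parent;
        }
    }

    public bool TryDequeue(out TSource item)
    {
        if (_heap.Count == 0) { item = default; return false; }
        item = _heap[0].Item;
        _heap[0] = _heap[^1]; // move the last entry to the root...
        _heap.RemoveAt(_heap.Count - 1);
        int i = 0;
        while (true) // ...and sift it down to its correct position
        {
            int left = 2 * i + 1, right = 2 * i + 2, smallest = i;
            if (left < _heap.Count && _comparer.Compare(_heap[left].Priority, _heap[smallest].Priority) < 0)
                smallest = left;
            if (right < _heap.Count && _comparer.Compare(_heap[right].Priority, _heap[smallest].Priority) < 0)
                smallest = right;
            if (smallest == i) break;
            (_heap[i], _heap[smallest]) = (_heap[smallest], _heap[i]);
            i = smallest;
        }
        return true;
    }
}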