Right now in my basic event simulation engine, I just re-sort the list of event objects by priority every step of the simulation. I do this because new events can be created during event updates and are appended to the list, and when an event expires, I "swap and pop" it with the last event in the list for performance. Should I be using two priority queues instead? It seems like the O(n log n) of sorting every step is at least as cheap as dequeuing all the events (also O(n log n)?) and putting each unexpired one into another list that becomes the priority queue for the next update step.
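For concreteness, here's a rough sketch of what I'm doing now. The `Event` class and its lifetime counter are just stand-ins for my real update logic:

```python
class Event:
    """Stand-in event: lower priority value runs first; expires after `lifetime` updates."""
    def __init__(self, priority, lifetime):
        self.priority = priority
        self.lifetime = lifetime

    @property
    def expired(self):
        return self.lifetime <= 0

    def update(self, events):
        self.lifetime -= 1
        # a real update may also append newly created events to `events` here

def step(events):
    # re-sort the whole list every step: O(n log n)
    events.sort(key=lambda e: e.priority)
    i = 0
    while i < len(events):
        events[i].update(events)
        if events[i].expired:
            # "swap and pop": O(1) removal; ordering is restored by next step's sort
            events[i] = events[-1]
            events.pop()
        else:
            i += 1
```

One thing worth noting in sorting's favor: since only a few events move in or out per step, the list is nearly sorted each time, and an adaptive sort (like Python's Timsort) runs close to O(n) on nearly-sorted input.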
EDIT: I think it would be more apt to call the 'events' 'processes' and the whole thing a process scheduling simulation. Each object in the queue has its state updated in priority order, and only if it has expired (entered some sort of conclusion state) is it discarded instead of being reinserted into the queue. This is why a single priority queue could be a problem: when an object is reinserted, it still has the lowest priority value and would just be pulled out again in the same cycle. I was considering a second queue into which I insert all newly spawned process objects and the ones that did not expire, without regard to priority; then I could just build-heap it and swap it with the active queue before the start of the next update cycle.
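The double-buffered version I'm considering would look roughly like this (again, `Process` is a placeholder for my real state machine; if priorities can tie, the heap entries would also need a tie-breaking counter so the tuple comparison never falls through to the objects):

```python
import heapq

class Process:
    """Stand-in process: expires after `lifetime` updates, may spawn children."""
    def __init__(self, priority, lifetime, spawn=()):
        self.priority = priority
        self.lifetime = lifetime
        self.spawn = list(spawn)  # children emitted on the first update

    @property
    def expired(self):
        return self.lifetime <= 0

    def update(self):
        self.lifetime -= 1
        children, self.spawn = self.spawn, []
        return children

def step(active):
    """One cycle: drain `active` in priority order, buffer survivors and
    newly spawned processes unordered, then build-heap the buffer once."""
    next_cycle = []                              # plain list: O(1) appends, no ordering
    while active:
        priority, proc = heapq.heappop(active)   # O(log n) per pop
        for child in proc.update():              # update may spawn new processes
            next_cycle.append((child.priority, child))
        if not proc.expired:
            next_cycle.append((priority, proc))
    heapq.heapify(next_cycle)                    # build-heap: O(n)
    return next_cycle                            # swap: becomes the active queue
```

This avoids the reinsertion loop entirely, since survivors only ever go into the inactive buffer; the per-cycle cost is n pops at O(log n) each plus one O(n) heapify.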