I was reading about the worst-case time complexity of Dijkstra's algorithm using a binary heap (with the graph represented as an adjacency list). According to Wikipedia (https://en.wikipedia.org/wiki/Dijkstra%27s_algorithm#Running_time) and various Stack Overflow questions, this is O((V + E) log V), where E is the number of edges and V is the number of vertices. However, I found no explanation as to why it can't be done in O(V + E log V).
The article states: "With a self-balancing binary search tree or binary heap, the algorithm requires Θ((E + V) log V) time in the worst case."
If E >= V, then O((E + V) log V) = O(E log V) anyway, since E + V <= 2E, so the two bounds coincide. Otherwise (E < V), there are at most O(E) vertices in the same connected component as the start vertex, because a component of k vertices needs at least k - 1 edges, and the algorithm ends once it has processed all of them. Each iteration extracts one of these vertices from the heap, taking O(log V) time.
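To spell out the arithmetic of this case split (my own bookkeeping, not taken from the article):

```latex
% Case E >= V: the V term is absorbed into the E term
(E + V)\log V \le 2E\log V = O(E\log V) \subseteq O(V + E\log V)

% Case E < V: at most E + 1 reachable vertices, each extracted once
\underbrace{O(V)}_{\text{init}} + \underbrace{O(E\log V)}_{\text{heap operations}} = O(V + E\log V)
```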
Each update of the distance to an adjacent vertex takes O(log V), and the number of these updates is bounded by the number of edges E, so in total we perform O(E) such updates. Adding O(V) time for initializing the distances, we get a final complexity of O(V + E log V).
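Here is a minimal sketch of the implementation I have in mind (standard binary-heap Dijkstra with lazy deletion via Python's heapq; the adjacency-list format and 0-indexed vertices are my own assumptions for illustration):

```python
import heapq

def dijkstra(adj, source):
    # adj[u] is a list of (v, weight) pairs; vertices are 0 .. len(adj) - 1
    INF = float("inf")
    dist = [INF] * len(adj)        # O(V): initialize distances
    dist[source] = 0
    heap = [(0, source)]           # only reachable vertices ever enter the heap

    while heap:
        d, u = heapq.heappop(heap)             # O(log V) per extraction
        if d > dist[u]:                        # stale entry (lazy deletion), skip
            continue
        for v, w in adj[u]:                    # each edge is scanned at most once overall
            nd = d + w
            if nd < dist[v]:
                dist[v] = nd
                heapq.heappush(heap, (nd, v)) # O(log V), at most E pushes total
    return dist

# Hypothetical 4-vertex undirected example
adj = [[(1, 2), (2, 5)], [(0, 2), (2, 1)], [(0, 5), (1, 1), (3, 3)], [(2, 3)]]
print(dijkstra(adj, 0))  # [0, 2, 3, 6]
```

Counting operations in this sketch: pushes are bounded by E, pops are bounded by pushes, and each heap operation costs O(log E) = O(log V) because E <= V^2 for a simple graph; together with the O(V) initialization, that gives exactly the O(V + E log V) derived above.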
Where am I wrong?