How would this algorithm be changed to give a maximum spanning tree?
    MST_prim(G, w, r)
        for each u in G.V
            u.key = inf
            u.pi  = NIL
        r.key = 0
        Q = G.V
        while Q is not empty
            u = EXTRACT-MIN(Q)
            for each v in G.Adj[u]
                if v in Q and w(u, v) < v.key
                    v.pi  = u
                    v.key = w(u, v)
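
For context, here is a minimal runnable sketch of the pseudocode above in Python (the minimum version, exactly as written). The adjacency-dict graph representation, the w weight function, and the linear-scan EXTRACT-MIN are just assumptions I made for illustration, not part of the original pseudocode.

    import math

    def mst_prim(G, w, r):
        # Sketch of the pseudocode above, assuming G is {vertex: {neighbor: weight}},
        # w(u, v) returns an edge weight, and r is the starting vertex.
        key = {u: math.inf for u in G}            # u.key = inf
        pi = {u: None for u in G}                 # u.pi = NIL
        key[r] = 0                                # r.key = 0
        Q = set(G)                                # Q = G.V
        while Q:                                  # while Q is not empty
            u = min(Q, key=lambda x: key[x])      # EXTRACT-MIN(Q), done by linear scan
            Q.remove(u)
            for v in G[u]:                        # for each v in G.Adj[u]
                if v in Q and w(u, v) < key[v]:
                    pi[v] = u
                    key[v] = w(u, v)
        return pi                                 # parent map describing the tree

    # Example on a small (made-up) triangle graph:
    G = {'a': {'b': 1, 'c': 3}, 'b': {'a': 1, 'c': 2}, 'c': {'a': 3, 'b': 2}}
    print(mst_prim(G, lambda u, v: G[u][v], 'a'))  # {'a': None, 'b': 'a', 'c': 'b'}

The linear scan just keeps the code close to the pseudocode; in practice EXTRACT-MIN would be a priority queue operation.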
I tried changing it so that it uses

    u = EXTRACT-MAX(Q)    and    w(u, v) > v.key

but I don't think that's correct.