I want to calculate the personalized PageRank value of every node in a large, dense graph with over a million nodes. Currently I am using the networkx library and setting the personalization value as described in the answer to "Using python's networkX to compute personalized page rank".
However, is there a more efficient way, in terms of time complexity, to do this? Thank you very much.
from networkx import pagerank_numpy

def findRank(graph, node):
    # Guard against nodes that are not present in the graph
    if not graph.has_node(node):
        print("No such node in the graph")
        return None
    # Personalization vector: all weight on the query node, zero elsewhere
    nodeList = list(graph.nodes)
    nodeDict = dict()
    for graphnode in nodeList:
        if graphnode != node:
            nodeDict[graphnode] = 0
        else:
            nodeDict[graphnode] = 1
    rank = pagerank_numpy(
        graph,
        alpha=0.85,
        personalization=nodeDict,
        weight='weight',
        dangling=None
    )
    return rank
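For reference, I am calling it roughly like this, once per node (variable names here are just placeholders for my actual graph):

allRanks = {}
for node in graph.nodes:
    # one full pagerank_numpy run per node
    allRanks[node] = findRank(graph, node)

So the total cost is one complete PageRank computation for each of the million-plus nodes, which is what I would like to speed up.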