Suppose an object a has a very expensive hash function, and I want to look a up in several different dicts or sets. If I do it naively:

    d1_res = d1[a]
    d2_res = d2[a]

the hash is computed twice. What I would like is something like:
    hashvalue = hash(a)
    d1_res = d1.getitem(a, hashvalue=hashvalue)
    d2_res = d2.getitem(a, hashvalue=hashvalue)
That way the hash is computed only once. Is there any way to do this? Or is there some underlying Python mechanism that prevents such an interface?
An obvious solution would be to cache the hash result in the __hash__ method, but my example here is simplified. In my real case the hash function is not actually expensive (it is just an int hash); it is simply called a great many times, and I want to cut that cost. I'm writing a C/C++ extension, so I'm looking for any possible performance improvement.
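For reference, the caching idea mentioned above can be sketched as follows. This is a minimal pure-Python illustration, not my actual code; ExpensiveKey and its fields are hypothetical names, and the hash here is just a tuple hash standing in for an expensive one:

    # Compute the (assumed expensive) hash once in __hash__ and reuse
    # the stored value on every later call.
    class ExpensiveKey:
        def __init__(self, data):
            self.data = data
            self._hash = None  # not computed yet

        def __hash__(self):
            if self._hash is None:
                # pretend this is expensive; here it is just tuple hashing
                self._hash = hash(self.data)
            return self._hash

        def __eq__(self, other):
            return isinstance(other, ExpensiveKey) and self.data == other.data

    k = ExpensiveKey(("some", "large", "key"))
    d1 = {k: 1}
    d2 = {k: 2}
    # Both lookups call __hash__, but hash(self.data) ran only once.
    d1_res = d1[k]
    d2_res = d2[k]

As noted, this does not help my real case, where the per-call overhead of hashing itself (not the hash computation) is what I want to avoid.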
Thanks in advance.