Consider the following function:
    def f(x, *args):
        intermediate = computationally_expensive_fct(x)
        return do_stuff(intermediate, *args)
The problem: for the same x, this function might be called thousands of times with different values of the other arguments, and on every call intermediate is recomputed (a Cholesky factorisation, cost O(n^3)). In principle it would be enough to compute intermediate once per distinct x and then reuse that result in every subsequent call to f, whatever the other args are.
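To make the redundant work concrete, here is a minimal, self-contained sketch of the situation; the bodies of computationally_expensive_fct and do_stuff are hypothetical stand-ins, and the counter is only there to show how often the expensive step runs:

    import numpy as np

    call_count = 0

    def computationally_expensive_fct(x):
        # Stand-in for the O(n^3) step, e.g. a Cholesky factorisation.
        global call_count
        call_count += 1
        return np.linalg.cholesky(x)

    def do_stuff(intermediate, *args):
        # Stand-in for the cheap per-call work.
        return intermediate.sum() + sum(args)

    def f(x, *args):
        intermediate = computationally_expensive_fct(x)
        return do_stuff(intermediate, *args)

    x = np.eye(100)
    results = [f(x, k) for k in range(1000)]
    print(call_count)   # 1000 factorisations, all for the same x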
My idea: to remedy this, I tried to create a global dictionary in which the function looks up whether the expensive step has already been computed and stored for its argument x, or whether it still has to compute it:
    if all_intermediates not in globals():
        global all_intermediates = {}
    if all_intermediates.has_key(x):
        pass
    else:
        global all_intermediates[x] = computationally_expensive_fct(x)
It turns out I can't do this, because globals() is itself a dict and you can't hash dicts in Python. I'm a novice programmer and would be happy if someone could point me towards a pythonic way to achieve what I want.
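Something along these lines is what I am imagining, though I am not sure it is idiomatic; it is only a sketch of the dictionary idea using an ordinary module-level dict instead of globals(), and it assumes x is hashable (a NumPy array, for instance, is not, so the key would need adapting):

    _all_intermediates = {}   # module-level cache: one entry per distinct x

    def f(x, *args):
        # Compute the expensive intermediate only the first time this x is seen.
        if x not in _all_intermediates:
            _all_intermediates[x] = computationally_expensive_fct(x)
        return do_stuff(_all_intermediates[x], *args)

Possibly functools.lru_cache (Python 3.2+) could do the same memoisation without the explicit dict, again provided x is hashable.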