
Trying to speed up a potential flow aerodynamic solver. Instead of calculating the velocity at an arbitrary point with a relatively expensive formula, I tried to precalculate a velocity field so that I could interpolate the values and (hopefully) speed up the code. The result was a slow-down, due (I think) to scipy.interpolate.RegularGridInterpolator being constructed on every call. How can I cache the function that this call returns? Everything I tried gives me hashing errors.

I have a method that implements the interpolator and a second 'factory' method to reduce the argument list so that it can be used in an ODE solver.

x_panels and y_panels are 1D arrays/tuples, vels is a 2D array/tuple, x and y are floats.

import numpy as np
import scipy.interpolate as sp_int

def _vol_vel_factory(x_panels, y_panels, vels):
    # Function factory: bind the grid data so the returned function only
    # needs (x, y, t), which is the signature the ODE solver expects.
    def _vol_vel(x, y, t=0):
        return _volume_velocity(x, y, x_panels, y_panels, vels)
    return _vol_vel

def _volume_velocity(x, y, x_panels, y_panels, vels):
    # The interpolator is rebuilt on every call; this is the expensive step.
    velfunc = sp_int.RegularGridInterpolator(
        (x_panels, y_panels), vels
    )
    return velfunc(np.array([x, y])).reshape(2)
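For context, here is a rough sketch of how the factory output gets fed to an ODE solver. The choice of solve_ivp and the toy grid are illustrative only, not my exact setup; the .reshape(2) above suggests vels carries two velocity components per grid point, so the toy field below has shape (nx, ny, 2).

from scipy.integrate import solve_ivp

# toy grid and a 2-component velocity field, just to make the sketch runnable
x_panels = np.linspace(-1.0, 1.0, 11)
y_panels = np.linspace(-1.0, 1.0, 11)
vels = np.stack(np.meshgrid(x_panels, y_panels, indexing="ij"), axis=-1)

vol_vel = _vol_vel_factory(x_panels, y_panels, vels)

def rhs(t, pos):
    # particle advection: d(pos)/dt = interpolated velocity at pos
    return vol_vel(pos[0], pos[1], t)

sol = solve_ivp(rhs, (0.0, 1.0), [0.1, 0.2])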

By passing tuples instead of arrays as inputs I was able to get a bit further, but converting the method output to a tuple did not make a difference; I still got the hashing error.

In any case, caching the result of the _volume_velocity method is not really what I want to do; I really want to somehow cache the result of _vol_vel_factory, whose result is a function. I am not sure if this is even a valid concept.
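A sketch of the sort of attempt that triggers the error, assuming functools.lru_cache is the caching mechanism:

import functools

# With array arguments the decorated factory fails as soon as it is called,
# because lru_cache hashes its arguments to build the cache key and
# numpy.ndarray does not implement __hash__.
cached_factory = functools.lru_cache(maxsize=None)(_vol_vel_factory)
vol_vel = cached_factory(x_panels, y_panels, vels)
# -> TypeError: unhashable type: 'numpy.ndarray'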

Paul Wells
  • Possible duplicate of [What is memoization and how can I use it in Python?](http://stackoverflow.com/questions/1988804/what-is-memoization-and-how-can-i-use-it-in-python) –  May 24 '16 at 01:45

1 Answer


numpy arrays do not implement __hash__, so they cannot be used as cache keys: functools.lru_cache builds its key by hashing the function's arguments, and your factory's arguments (x_panels, y_panels, vels) are arrays, which is where the hashing error comes from.

You can, however, store a hashable representation of the arrays, cache on that, and convert back to numpy arrays inside the cached function. For details on how to do that, see:

How to hash a large object (dataset) in Python?
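A minimal sketch of that idea, assuming the arrays fit comfortably in memory as bytes keys: convert them to bytes (plus shape), let functools.lru_cache key on those, and rebuild the arrays and the interpolator only on a cache miss. The function names here are illustrative.

import functools
import numpy as np
import scipy.interpolate as sp_int

@functools.lru_cache(maxsize=None)
def _vol_vel_factory_cached(x_bytes, y_bytes, vels_bytes, vels_shape):
    # Rebuild the arrays from their byte representation (float64 assumed).
    x_panels = np.frombuffer(x_bytes)
    y_panels = np.frombuffer(y_bytes)
    vels = np.frombuffer(vels_bytes).reshape(vels_shape)
    # The interpolator is built once per distinct grid and then reused.
    velfunc = sp_int.RegularGridInterpolator((x_panels, y_panels), vels)

    def _vol_vel(x, y, t=0):
        return velfunc(np.array([x, y])).reshape(2)

    return _vol_vel

def vol_vel_factory(x_panels, y_panels, vels):
    # Thin wrapper that turns the unhashable arrays into hashable keys.
    x_panels = np.asarray(x_panels, dtype=float)
    y_panels = np.asarray(y_panels, dtype=float)
    vels = np.asarray(vels, dtype=float)
    return _vol_vel_factory_cached(
        x_panels.tobytes(), y_panels.tobytes(), vels.tobytes(), vels.shape
    )

The practical speed-up comes from constructing the RegularGridInterpolator once per distinct grid rather than on every velocity evaluation; hashing the byte strings on each factory call is cheap by comparison.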

gnicholas