
As a beginner in Python, I started to implement an idea I had already implemented in MATLAB. I want to identify the intersection points (x,y coordinates) of two curves given only as arrays, WITHOUT an explicit function. That is, arrays containing only int values might have intersections with float coordinates. For this, the web offers the code attached below.

After installing scipy (and interpolate) via pip, I can't get past the following:

ERROR: AttributeError: 'module' object has no attribute 'PiecewisePolynomial'.

The search function hasn't helped.

I would be very happy to receive some help with:

  1. resolving the ERROR, and/or
  2. computing the intersection differently (is scipy the only way?)

PS: Computing the intersection via the difference in MATLAB was not expedient, because the intersections are not unique within some intervals.

import scipy.interpolate as interpolate
import scipy.optimize as optimize
import numpy as np

x1=np.array([1.4,2.1,3,5.9,8,9,23])
y1=np.array([2.3,3.1,1,3.9,8,9,11])
x2=np.array([1,2,3,4,6,8,9])
y2=np.array([4,12,7,1,6.3,8.5,12])    

p1 = interpolate.PiecewisePolynomial(x1,y1[:,np.newaxis])
p2 = interpolate.PiecewisePolynomial(x2,y2[:,np.newaxis])

def pdiff(x):
    return p1(x)-p2(x)

xs=np.r_[x1,x2]
xs.sort()
x_min=xs.min()
x_max=xs.max()
x_mid=xs[:-1]+np.diff(xs)/2
roots=set()
for val in x_mid:
    root,infodict,ier,mesg = optimize.fsolve(pdiff,val,full_output=True)
    # ier==1 indicates a root has been found
    if ier==1 and x_min<root<x_max:
        roots.add(root[0])
roots=list(roots)        
print(np.column_stack((roots,p1(roots),p2(roots))))
    Newer versions of scipy don't seem to have that method, [`PPoly`](https://docs.scipy.org/doc/scipy-0.18.1/reference/generated/scipy.interpolate.PPoly.html#scipy.interpolate.PPoly) seems to be available instead (it has different syntax and semantics). – Andras Deak -- Слава Україні Nov 13 '16 at 23:42
  • Do you understand what the error is telling you? – wwii Nov 14 '16 at 00:14
  • Are you saying that you found some code on the web and you don't know how it works? – wwii Nov 14 '16 at 00:20
  • `scipy.interpolate.PiecewisePolynomial` was removed in scipy 0.18.0; it had been deprecated since version 0.14. See the section ["Backwards incompatible changes"](https://docs.scipy.org/doc/scipy/reference/release.0.18.0.html#backwards-incompatible-changes) in the [SciPy 0.18.0 Release Notes](https://docs.scipy.org/doc/scipy/reference/release.0.18.0.html). – Warren Weckesser Nov 14 '16 at 05:50
  • It would work the same if you replace `PiecewisePolynomial(...)` with `BPoly.from_derivatives(...)` – ev-br Nov 14 '16 at 16:06
  • @wwii 1) yes 2) no, I do understand. @Warren Weckesser thanks for the information! @ev-br indeed BPoly.from_derivatives(...) solved the ERROR, so thank you for the hint. But as Andras Deak mentioned, it didn't do what I was expecting (= wrong intersections). Thus scipy.interpolate.interp1d is the way to go ... – IMHO Nov 15 '16 at 01:27
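
(For reference, a minimal sketch of the drop-in replacement mentioned in the comments above, using the question's data. It is only an illustration; as noted, it removes the AttributeError but may still not produce the intersections the question expects.)

import numpy as np
import scipy.interpolate as interpolate

x1 = np.array([1.4, 2.1, 3, 5.9, 8, 9, 23])
y1 = np.array([2.3, 3.1, 1, 3.9, 8, 9, 11])

# BPoly.from_derivatives accepts the same (x, y[:, np.newaxis]) arguments
# as the removed PiecewisePolynomial constructor
p1 = interpolate.BPoly.from_derivatives(x1, y1[:, np.newaxis])
print(p1(2.0))  # evaluate the piecewise polynomial at a single point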

1 Answer


As I noted in a comment, the method you were trying to use isn't available in newer versions of scipy, but it didn't do what you expected it to do anyway.

My suggestion is to use numpy.interp[1] or scipy.interpolate.interp1d to construct a linear interpolator of your functions, then use fsolve as you did to find all possible intersections. Since fsolve (much like MATLAB's fzero) can only find a single intersection at a time, you indeed need to loop over sections in your data to look for intersections.

import scipy.interpolate as interpolate
import scipy.optimize as optimize
import numpy as np

x1 = np.array([1.4,2.1,3,5.9,8,9,23])
y1 = np.array([2.3,3.1,1,3.9,8,9,11])
x2 = np.array([1,2,3,4,6,8,9])
y2 = np.array([4,12,7,1,6.3,8.5,12])    

# linear interpolators
opts = {'fill_value': 'extrapolate'}
f1 = interpolate.interp1d(x1,y1,**opts)
f2 = interpolate.interp1d(x2,y2,**opts)

# possible range for an intersection
xmin = np.min((x1,x2))
xmax = np.max((x1,x2))

# number of intersections
xuniq = np.unique((x1,x2))
xvals = xuniq[(xmin<=xuniq) & (xuniq<=xmax)]
# note that it's bad practice to compare floats exactly,
# but the worst case here is a bit of redundancy, no harm done

# for each combined interval there can be at most 1 intersection,
# so looping over xvals should hopefully be enough
# one can always err on the safe side and loop over a `np.linspace`

intersects = []
for xval in xvals:
    x0, = optimize.fsolve(lambda x: f1(x)-f2(x), xval)
    if (xmin<=x0<=xmax
        and np.isclose(f1(x0),f2(x0))
        and not any(np.isclose(x0,intersects))):
        intersects.append(x0)

print(intersects)
print(f1(intersects))
print(f2(intersects))

Apart from a few runtime warnings from the more problematic sections of the algorithm, the above finds the two intersections of your functions:

[plot of functions]

Key steps are checking that the results from fsolve are new (not close to your previous roots), and that you did actually find a root at the given x0.

Alternatively, you could use the intervals defined in xvals, treat the piecewise linear functions on each interval as straight lines, and check analytically whether two lines with these parameters (I mean x1=xvals[i], x2=xvals[i+1], y11=f1(x1), y12=f1(x2) etc.) have an intersection on the given segment. You will likely be able to vectorize this approach, you won't have to worry about stochasticity in your results, and you only have to watch out for possible singularities in the data (intervals where np.diff(xvals) is small and ends up in a denominator).
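
A minimal sketch of what that analytic, vectorized variant could look like (only an illustration reusing the data and the f1, f2, xvals definitions from the code above; variable names such as xa, xb, d0, dm are my own):

import numpy as np
import scipy.interpolate as interpolate

x1 = np.array([1.4, 2.1, 3, 5.9, 8, 9, 23])
y1 = np.array([2.3, 3.1, 1, 3.9, 8, 9, 11])
x2 = np.array([1, 2, 3, 4, 6, 8, 9])
y2 = np.array([4, 12, 7, 1, 6.3, 8.5, 12])

f1 = interpolate.interp1d(x1, y1, fill_value='extrapolate')
f2 = interpolate.interp1d(x2, y2, fill_value='extrapolate')

# breakpoints of the combined grid; between consecutive values
# both f1 and f2 are straight lines
xvals = np.unique((x1, x2))
xa, xb = xvals[:-1], xvals[1:]

# slopes of the two lines on each interval
m1 = (f1(xb) - f1(xa)) / (xb - xa)
m2 = (f2(xb) - f2(xa)) / (xb - xa)

# difference of the two lines: d(x) = (f1(xa) - f2(xa)) + (m1 - m2)*(x - xa)
d0 = f1(xa) - f2(xa)
dm = m1 - m2

# (nearly) parallel segments cannot give a well-defined root
ok = ~np.isclose(dm, 0)
xs = xa[ok] - d0[ok] / dm[ok]             # candidate intersection abscissae
xs = xs[(xs >= xa[ok]) & (xs <= xb[ok])]  # keep only roots inside their own segment

print(np.column_stack((xs, f1(xs), f2(xs))))

One caveat of this sketch: an intersection that falls exactly on a breakpoint can show up twice (once for each neighbouring segment), so a final deduplication with np.unique or np.isclose may still be useful.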


[1] numpy.interp doesn't define an interpolator function; rather, it directly computes the interpolated values on a grid. In order to do the same using this function, you would have to define a function that calls numpy.interp for the given x value. This would likely be less efficient than the above, due to the high number of function evaluations during the zero search.
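
For illustration, such a wrapper could be as simple as the following (my own sketch, not part of the original answer; note that numpy.interp clamps to the boundary values instead of extrapolating outside the data range):

import numpy as np

x1 = np.array([1.4, 2.1, 3, 5.9, 8, 9, 23])
y1 = np.array([2.3, 3.1, 1, 3.9, 8, 9, 11])

# wrap numpy.interp so it can be called like the interp1d objects above;
# each call re-runs the interpolation, hence the efficiency remark
def f1(x):
    return np.interp(x, x1, y1)

print(f1(2.0))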

  • Thank you Andras Deak for your great help. It worked for me. I'll analyze your code step by step for a better understanding of Python. I'm impressed by the speed of Python, despite the runtime warnings (= less efficiency). I wonder how the algorithm will behave with far more data points and intersections; gonna check that. Big thanks! – IMHO Nov 14 '16 at 10:43
  • @IMHO I'm glad I could help:) Some of the code might not be entirely idiomatic, but I couldn't come up with more elegant solutions. The warnings mostly tell you that the zero-search wasn't *good enough*, so it says little about speed. Especially for a lot of intersections, you should consider implementing the alternate route: looking at each interval in `xvals` and computing/checking the intersections analytically (without `fsolve`). Do you understand what I mean? – Andras Deak -- Слава Україні Nov 14 '16 at 12:20
  • @Andras Deak I respect your modesty, but in my opinion you did a great job, so big thanks again. As the questioner I got a super fast and detailed answer, and your hint towards scipy.interpolate.interp1d & numpy.interp was worth gold for a Python beginner. I wasn't even aware that newer versions of scipy no longer have my method. I still don't understand why methods are deleted in a newer version without a hint to other methods, just an ERROR. Your code even works fine with ~4000 data points and ~300 intersections. I do understand your analytical description and will try to implement it. – IMHO Nov 15 '16 at 00:19
  • @IMHO thanks, and good luck:) As for the deletion of the function: [the release notes specify](https://docs.scipy.org/doc/scipy/reference/release.0.18.0.html#scipy-interpolate) that they are no longer available, and what the exact replacement is. I can see why the current documentation doesn't mention deprecated/removed features, but I know that google doesn't give you the newest version of the documentation:) – Andras Deak -- Слава Україні Nov 15 '16 at 00:25