
I need to find the minimum distance from a point (X, Y) to a cubic curve defined by four coefficients C0, C1, C2, C3, i.e. y = C0 + C1*x + C2*x^2 + C3*x^3.

I have used a numerical approach: np.linspace and np.polyval to generate discrete (x, y) points on the curve, then Shapely's Point, MultiPoint and nearest_points to find the nearest sampled point, and finally np.linalg.norm to compute the distance.

This is a numerical approach that works by discretizing the curve.
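
For context, a minimal sketch of that discretized approach might look like this (the coefficients, query point and sampling range below are placeholder values):

import numpy as np
from shapely.geometry import Point, MultiPoint
from shapely.ops import nearest_points

# Placeholder curve coefficients and query point
C0, C1, C2, C3 = -1.0, 5.0, -5.0, 6.0
X, Y = 12.0, -7.0

# Sample the curve (np.polyval expects the highest-order coefficient first)
xs = np.linspace(-10, 10, 2001)
ys = np.polyval([C3, C2, C1, C0], xs)

# Nearest sampled curve point to (X, Y)
curve = MultiPoint(list(zip(xs, ys)))
_, nearest = nearest_points(Point(X, Y), curve)

dist = np.linalg.norm([nearest.x - X, nearest.y - Y])
print(nearest.x, nearest.y, dist)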

My question is how can I find the distance by analytical methods and code it?

  • [Here](https://math.stackexchange.com/questions/2264702/shortest-distance-from-point-to-curve) is the analytical approach. – blunova Oct 26 '22 at 12:49
  • see [Is it possible to express "t" variable from Cubic Bezier Curve equation?](https://stackoverflow.com/a/60113617/2521214) – Spektre Oct 27 '22 at 06:55

2 Answers


Problem definition

For the sake of simplicity let's use P for the point and Px and Py for the coordinates. Let's call the function f(x).

Another way to look at your problem is that you're trying to find an x that minimizes the distance between P and the point (x, f(x)).

The problem can then be formulated as a minimization problem.

Find x that minimizes (x-Px)² + (f(x)-Py)²

(Note that we can drop the square root that should be there, because the square root is a monotonic function and doesn't change the optimum. Some details here.)

Analytical solution

The fully analytical way to solve this would be a pen-and-paper approach. You can expand the expression, compute its derivative, and see where it vanishes to find the extrema. (This will be a lengthy process to do analytically; @Yves Daoust addresses it in his answer. Either do that or use a numerical solver for this part, for example a variant of Newton's method should do; see the sketch below.) Then check whether each extremum is a maximum or a minimum by computing the point and sampling a few points around it to check how the function evolves. From this you can find the global minimum, and that gives you the x you're looking for. But developing all of this is content probably better suited for a mathematics site.
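
As an illustration of that route, here is a minimal sketch (with the same placeholder coefficients and point as the code further down) that applies a plain Newton iteration to the derivative of the squared distance. It returns only one candidate stationary point, which still has to be checked against the other extrema as described above, and convergence depends on the starting guess.

# Squared distance g(x) = (x - Px)^2 + (f(x) - Py)^2; Newton's method on g'(x) = 0
C0, C1, C2, C3 = -1.0, 5.0, -5.0, 6.0   # placeholder cubic
p_x, p_y = 12.0, -7.0                   # placeholder point P

f   = lambda x: C0 + C1*x + C2*x**2 + C3*x**3   # the curve
df  = lambda x: C1 + 2*C2*x + 3*C3*x**2         # f'(x)
d2f = lambda x: 2*C2 + 6*C3*x                   # f''(x)

dg  = lambda x: 2*(x - p_x) + 2*(f(x) - p_y)*df(x)       # g'(x)
d2g = lambda x: 2 + 2*(df(x)**2 + (f(x) - p_y)*d2f(x))   # g''(x)

x = 0.0                                  # starting guess
for _ in range(100):
    step = dg(x) / d2g(x)
    x -= step
    if abs(step) < 1e-12:
        break

print("candidate extremum at x =", x, ", y =", f(x))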

Optimization approach

So instead I'm going to suggest a solution that uses numerical minimization without sampling the curve. You can use the minimize function from scipy.optimize to solve the minimization problem.

from scipy.optimize import minimize

# Define the curve
C0 = -1
C1 = 5
C2 = -5
C3 = 6
f = lambda x: C0 + C1 * x + C2 * x**2 + C3 * x**3

# Define the squared distance to minimize
p_x = 12
p_y = -7
min_f = lambda x: (x - p_x)**2 + (f(x) - p_y)**2

# Minimize
min_res = minimize(min_f, 0)  # The starting point doesn't really matter here

# Show result
print("Closest point is x=", min_res.x[0], " y=", f(min_res.x[0]))

Here I used your function with dummy values but you could use any function you want with this approach.

  • The analytical solution does not work for a cubic curve. –  Oct 27 '22 at 07:05
  • 1
    analytical solution is way more complex and less accurate than search/optimization for cubic curves as it need a lot of branches and computing really complicated complex domain expressions ... – Spektre Oct 27 '22 at 09:29
  • I would assume you're referring to the fact that I glossed over finding the roots of a 5th-degree polynomial with "see where it vanishes to find the extrema". I'll be honest, I didn't take the time to look into it and I forgot it doesn't necessarily have a closed-form solution. We know there is at least one real root provided all coefficients in the problem are real. There are tons of numerical methods to find it. As for analytical solutions, it seems you have it covered in your answer. So I would say it works but it's just a pain to solve, am I wrong? – LNiederha Oct 27 '22 at 09:44

You need to differentiate (x - X)² + (C0 + C1 x + C2 x² + C3 x³ - Y)² and find the roots of the derivative. But this is a quintic polynomial (fifth degree) with general coefficients, so the Abel-Ruffini theorem fully applies, meaning that there is no solution in radicals.

There is a known solution anyway, obtained by reducing the equation (via a lengthy substitution process) to the form x^5 - x + t = 0, known as the Bring–Jerrard normal form, and getting the solutions (called ultraradicals) by means of Hermite's elliptic functions or by evaluating the roots with a Taylor series.


Personal note:

This approach is virtually foolish, as there exist ready-made numerical polynomial root-finders, and the ultraradical function is not easy to evaluate.

Anyway, looking at the plot of x^5 - x, one can see that it is intersected once or three times by any horizontal line, and finding an interval with a change of sign is easy. With that, you can obtain an accurate root by dichotomy (and far from the extrema, Newton will easily converge).

After having found this root, you can deflate the polynomial to a quartic, for which explicit formulas by radicals are known.
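
For illustration, here is a minimal sketch of the "ready-made numerical polynomial root-finder" route using numpy (np.poly1d and its companion-matrix root-finder); the coefficients and the point are placeholder values:

import numpy as np

# Placeholder curve coefficients and query point
C0, C1, C2, C3 = -1.0, 5.0, -5.0, 6.0
X, Y = 12.0, -7.0

f  = np.poly1d([C3, C2, C1, C0])   # the cubic, highest power first
df = f.deriv()                     # f'(x)

# Derivative of the squared distance (x - X)^2 + (f(x) - Y)^2,
# up to a factor of 2: (x - X) + (f(x) - Y) * f'(x)  -- a quintic
g = np.poly1d([1.0, -X]) + (f - Y) * df

roots = g.roots                                    # numerical root-finder
real_roots = roots.real[np.abs(roots.imag) < 1e-9]

# Among the real critical points, keep the one closest to (X, Y)
dists = np.hypot(real_roots - X, f(real_roots) - Y)
best = real_roots[np.argmin(dists)]
print("closest point:", best, f(best), "distance:", dists.min())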