To find the intersection of two polynomials, I tried to minimize a sum of least squares, as in the code below: first it plots the correct solution, then it prints the result of the minimization.
import numpy as np
import matplotlib.pyplot as plt
x = np.linspace(0, 1, 100)  # interval [0, 1]
def polyD(x):
    return 1.115355004199118 - 1.597163991790283*x + 0.6539311181514963*x**2

def polyS(x):
    return -0.03070291735792586 + 0.1601011622660309*x + 0.8530319920733438*x**2
root = 0.61           # x, supposed to have been found earlier...
y_root = polyD(root)  # y
print(root, y_root)   # x=0.61  y=0.38441273827121725
plt.plot(root, y_root, 'yo', x, polyD(x), 'r-', x, polyS(x), 'b-', ms=20)
plt.show()
####################
from scipy.optimize import minimize
x0 = 0
res = minimize(lambda t: sum((polyD(x) - polyS(x))**2), x0)
print(res)
print(res.fun)  # 36.59096676853359  why is it so large?
I could accept a small discrepancy due to the algorithm, but here the difference is on the order of 10^2.
What is my mistake, and how can I correct it? Should I define some constraints to keep x in [0, 1]? Or how else can I get the correct result for the intersection point?
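As a sketch of a bounded variant I also tried (assuming SciPy's minimize_scalar with method='bounded'; note the objective here takes the scalar t rather than the global x array, which may or may not be the intended objective):

```python
from scipy.optimize import minimize_scalar

def polyD(x):
    return 1.115355004199118 - 1.597163991790283*x + 0.6539311181514963*x**2

def polyS(x):
    return -0.03070291735792586 + 0.1601011622660309*x + 0.8530319920733438*x**2

# minimize the squared difference at a single point t, constrained to [0, 1]
res = minimize_scalar(lambda t: (polyD(t) - polyS(t))**2,
                      bounds=(0.0, 1.0), method='bounded')
print(res.x, res.fun)  # res.x near 0.61, res.fun near 0
```

With this formulation the minimizer stays inside [0, 1] without any extra constraint machinery.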
P.S. Or perhaps it is easier to solve with derivatives (minimizing to find an extremum)? EDIT: partial derivatives of the constrained function(s) (the Jacobian matrix) can be used to speed up execution - here
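For comparison, a direct root-finding sketch on the difference of the two polynomials (assuming scipy.optimize.brentq; this applies because polyD - polyS changes sign on [0, 1], so a bracketing method can be used):

```python
from scipy.optimize import brentq

def polyD(x):
    return 1.115355004199118 - 1.597163991790283*x + 0.6539311181514963*x**2

def polyS(x):
    return -0.03070291735792586 + 0.1601011622660309*x + 0.8530319920733438*x**2

# polyD(0) > polyS(0) and polyD(1) < polyS(1), so the difference
# has a sign change on [0, 1] and brentq can bracket the root
r = brentq(lambda x: polyD(x) - polyS(x), 0.0, 1.0)
print(r, polyD(r))  # r near 0.61
```

No derivatives are needed for brentq; it only evaluates the bracketed function.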