A set of points with coordinates x and y looks like this. I want to construct a curve in the region below y = 0 of the form -a - np.exp(-(x - b)/c), where the parameters a, b and c are determined by the condition that 90% of the points below y = 0 are enclosed between the line y = 0 and the curve.
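For concreteness, a point (xi, yi) with yi < 0 counts as enclosed when it lies above the curve, i.e. yi > -a - np.exp(-(xi - b)/c). A minimal sketch of that check for a single point, with made-up numbers (the parameter and point values here are placeholders, not fitted ones):
import numpy as np
a, b, c = 0.1, 0.2, 1.0               # placeholder parameters
xi, yi = 0.5, -0.2                    # one example point with yi < 0
curve = -a - np.exp(-(xi - b) / c)    # curve value at xi
enclosed = (yi < 0) and (yi > curve)  # True if the point sits between y = 0 and the curve
print(curve, enclosed)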
I've written the following code to do this, but minimize just returns the initial guess as the result and I don't know what I'm missing.
from scipy.optimize import minimize
import numpy as np

# x and y are the coordinate arrays of the points (loaded from the data file)

def enclosed_points(params):
    a, b, c = params
    den = (y < 0).sum()                 # number of points with y coordinate below y = 0
    func = -a - np.exp(-(x - b) / c)    # value of the curve at each x
    num = ((y < 0) & (y > func)).sum()  # number of points below y = 0 and above the curve
    return np.abs(num / den - 0.9)      # absolute difference between the enclosed
                                        # fraction and the target (0.9)

initial_guess = [0.1, 0.2, 1]           # dummy initial guess
result = minimize(enclosed_points, initial_guess)
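To show what I mean, here is how I probe the objective by hand around the initial guess (assuming x and y are already defined from the data); this only calls the function above directly, nothing else:
print(enclosed_points([0.1, 0.2, 1]))      # objective at the initial guess
print(enclosed_points([0.1001, 0.2, 1]))   # tiny perturbation in a
print(enclosed_points([0.1, 0.2001, 1]))   # tiny perturbation in b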
Edit. Here I have uploaded a random sample of the whole data in npy format.
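The sample can be read back with np.load; the filename and the column layout below are just assumptions about the uploaded file:
import numpy as np
data = np.load("sample.npy")    # placeholder filename for the uploaded sample
x, y = data[:, 0], data[:, 1]   # assuming one (x, y) pair per row; adjust to the actual layout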