I have this formula:
1 - e^(log(0.5) * (x / beta)^alpha)
where alpha and beta are the two variables I need to find. x comes from my data (a set of images), and I can compare the formula's output against ground truth obtained from a user test, so I can build a loss function that I would like to minimize. To find the best alpha and beta I tried TensorFlow, but gradient descent and other optimizers seem to fail because the function is not convex (I tried several different initial conditions). Is there a global optimization tool in Python that I can use to solve this problem?
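For reference, here is a minimal, self-contained sketch of what I am doing; `x_data` and `y_true` are synthetic placeholders for my real image data and user-test results, and the bounds on alpha and beta are just guesses:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic stand-ins: in my real code x_data is derived from the
# images and y_true from the user test.
rng = np.random.default_rng(0)
x_data = np.linspace(0.1, 10.0, 50)
y_true = 1 - np.exp(np.log(0.5) * (x_data / 3.0) ** 2.0)
y_true += rng.normal(0.0, 0.02, x_data.shape)  # measurement noise

def model(x, alpha, beta):
    # 1 - e^(log(0.5) * (x / beta)^alpha)
    return 1 - np.exp(np.log(0.5) * (x / beta) ** alpha)

def loss(params):
    alpha, beta = params
    return np.mean((model(x_data, alpha, beta) - y_true) ** 2)

# Bounded local optimization: the result depends on the
# starting point, which is the problem I am running into.
for start in [(0.5, 1.0), (2.0, 5.0), (8.0, 0.5)]:
    res = minimize(loss, start, method="L-BFGS-B",
                   bounds=[(1e-3, 10.0), (1e-3, 10.0)])
    print(start, "->", res.x, res.fun)
```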