
I'm using scipy.optimize.minimize with the Newton-CG (Newton Conjugate Gradient) method, since I have an objective function for which I know the analytical Jacobian and Hessian. However, I need to add a regularization term R = exp(max(s)) based on the maximum value inside the array parameter `s` that is being fitted. It isn't at all obvious to me how to implement derivatives for R. Letting the minimization algorithm compute numerical derivatives for the whole objective function isn't an option, by the way, because it is far too complex. Any thoughts, oh wise people of the web?
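
For concreteness, a minimal sketch of the setup (the quadratic toy objective with `A` and `b` is a hypothetical stand-in for my real, far more complex function; the argmax-based derivative of R is exactly the part I'm unsure about):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the real smooth objective with known
# analytical derivatives; only the R term is at issue.
A = np.diag([1.0, 2.0, 3.0])
b = np.array([1.0, -1.0, 0.5])

def f(s):
    return 0.5 * s @ A @ s - b @ s + np.exp(np.max(s))  # + R = exp(max(s))

def jac(s):
    g = A @ s - b
    # Naive derivative of R: all of the gradient goes into the argmax entry.
    # This is only a subgradient, valid where the maximum is unique.
    i = np.argmax(s)
    g[i] += np.exp(s[i])
    return g

def hess(s):
    H = A.copy()
    i = np.argmax(s)
    H[i, i] += np.exp(s[i])
    return H

res = minimize(f, np.zeros(3), method="Newton-CG", jac=jac, hess=hess)
```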

AstroBen
  • Including `max` makes the objective function non-differentiable, so any method that relies on the gradient and Hessian will perform poorly. Some options: use a derivative-free method (Nelder-Mead), or use a smooth [softmax](https://stackoverflow.com/questions/34968722/softmax-function-python) function instead (a sketch of this follows the comments). – Crazy Ivan Dec 05 '17 at 17:37
  • Or transform the unconstrained non-differentiable problem into a constrained differentiable one (a sketch of this reformulation also follows the comments). (And *numeric derivatives for the whole objective function isn't an option, by the way, because it is far too complex* isn't necessarily a good reason to forbid this approach; but num-diff alone won't help, as outlined by Ivan.) – sascha Dec 05 '17 at 17:51
  • Thanks Crazy Ivan. That's what I was afraid of. I will use another regularization metric that is differentiable. – AstroBen Dec 06 '17 at 12:45
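
Following Crazy Ivan's softmax suggestion, a minimal sketch of how a smooth log-sum-exp replacement for `max(s)` gives closed-form derivatives for R (the sharpness parameter `beta` is an assumption; larger values track the true maximum more closely, at the cost of conditioning):

```python
import numpy as np
from scipy.special import logsumexp, softmax

def smooth_max(s, beta=10.0):
    """Log-sum-exp approximation of max(s); exact in the limit beta -> inf."""
    return logsumexp(beta * s) / beta

def R(s, beta=10.0):
    return np.exp(smooth_max(s, beta))

def R_grad(s, beta=10.0):
    w = softmax(beta * s)  # w is the gradient of smooth_max w.r.t. s
    return R(s, beta) * w

def R_hess(s, beta=10.0):
    w = softmax(beta * s)
    # Chain rule: Hess(R) = R * (w w^T + Hess(smooth_max)),
    # with Hess(smooth_max) = beta * (diag(w) - w w^T).
    return R(s, beta) * (np.outer(w, w) + beta * (np.diag(w) - np.outer(w, w)))
```

These terms can simply be added to the existing analytical Jacobian and Hessian, so Newton-CG stays usable with no numerical differentiation.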
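
And a sketch of sascha's constrained reformulation, using the same hypothetical toy objective as above: introduce an auxiliary scalar `t` with constraints `s_i <= t`, so that at the optimum `t = max(s)` and every piece of the objective is smooth:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

A = np.diag([1.0, 2.0, 3.0])   # hypothetical stand-in, as before
b = np.array([1.0, -1.0, 0.5])
n = 3

# Variables are x = [s, t]; exp(t) replaces exp(max(s)) because exp(t)
# is increasing in t, so the minimizer drives t down to max(s).
def f(x):
    s, t = x[:n], x[n]
    return 0.5 * s @ A @ s - b @ s + np.exp(t)

def jac(x):
    s, t = x[:n], x[n]
    return np.concatenate([A @ s - b, [np.exp(t)]])

def hess(x):
    H = np.zeros((n + 1, n + 1))
    H[:n, :n] = A
    H[n, n] = np.exp(x[n])
    return H

# s_i - t <= 0 for every i
C = np.hstack([np.eye(n), -np.ones((n, 1))])
res = minimize(f, np.zeros(n + 1), method="trust-constr", jac=jac, hess=hess,
               constraints=LinearConstraint(C, -np.inf, 0.0))
print(res.x[:n])
```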

0 Answers