Levenberg-Marquardt is a widely used second-order optimization algorithm. In most cases, it outperforms first-order gradient optimization methods.
The algorithm takes advantage of the second-order partial derivatives of a function with respect to its variables. In first-order optimization methods (gradient descent), the update equation is as follows:

$$w_{t+1} = w_t - \eta \, \nabla f(w_t)$$

where $\eta$ is a fixed learning rate (step size).
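A minimal sketch of this fixed-step update, assuming a differentiable objective; the names `gradient_descent_step`, `f_grad`, and `eta` are illustrative placeholders, not from the original text.

```python
import numpy as np

def gradient_descent_step(w, f_grad, eta=0.01):
    """One gradient descent update with a fixed learning rate: w <- w - eta * grad f(w)."""
    return w - eta * f_grad(w)

# Example: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
for _ in range(100):
    w = gradient_descent_step(w, lambda w: 2 * w, eta=0.1)
print(w)  # approaches [0, 0]
```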
However, a problem with these methods is that $\eta$ is fixed in advance and thus prevents taking the curvature of the function surface into account. Levenberg-Marquardt resolves this issue, to a great extent, by using second-order partial derivatives (the Hessian matrix).
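Below is a minimal sketch of one Levenberg-Marquardt style (damped Newton) update, assuming the gradient and Hessian can be evaluated; `lm_step`, `grad_f`, `hess_f`, and the damping factor `lam` are illustrative placeholders, not from the original text. In the classical least-squares setting, the Hessian is approximated by $J^\top J$, where $J$ is the Jacobian of the residuals.

```python
import numpy as np

def lm_step(w, grad_f, hess_f, lam=1e-3):
    """One damped second-order update: w <- w - (H + lam*I)^{-1} g.

    The damping term lam*I interpolates between a Newton step (lam -> 0)
    and a small gradient descent step (large lam), so the step size adapts
    to the local curvature instead of being fixed in advance.
    """
    g = grad_f(w)
    H = hess_f(w)
    return w - np.linalg.solve(H + lam * np.eye(len(w)), g)

# Example on f(w) = w0^2 + 10*w1^2 (gradient [2*w0, 20*w1], Hessian diag(2, 20)).
w = np.array([3.0, -1.5])
for _ in range(20):
    w = lm_step(w,
                grad_f=lambda w: np.array([2 * w[0], 20 * w[1]]),
                hess_f=lambda w: np.diag([2.0, 20.0]),
                lam=1e-3)
print(w)  # converges toward [0, 0]
```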