
I am trying to solve a problem of finding the roots of a function using the Newton-Raphson (NR) method in the C language. The functions whose roots I would like to find are mostly polynomial, but may also contain trigonometric and logarithmic terms.

The NR method requires the derivative of the function. There are three ways to implement differentiation:

  • Symbolic
  • Numerical
  • Automatic (with sub-types forward mode and reverse mode; for this particular question, I would like to focus on forward mode)

I have thousands of these functions, and I need to find their roots in the quickest time possible.

From the little that I do know, automatic differentiation is in general quicker than symbolic because it handles the problem of "expression swell" a lot more efficiently.

My question therefore is: all other things being equal, which method of differentiation is more computationally efficient, automatic differentiation (specifically, forward mode) or numerical differentiation?

Dean P
  • I might be missing something, but if you only handle polynomial functions it's straightforward to find the differential function: aX^n -> naX^(n-1) – Support Ukraine Jun 07 '17 at 20:00
  • My apologies, post edited. They mostly contain polynomial but can contain other more complex expressions – Dean P Jun 07 '17 at 20:06
  • How do the trigs and logs appear? Are there trig functions of polynomials, or logarithms of polynomials, etc.? (e.g. is this the complete space of functions involving x, sin(a+b*x), cos(a+b*x) and log(a+b*x) under a finite number of addition and multiplication steps)? Or are they simple trigs, logs -- maybe the space of functions of powers of x, simple trigs, and simple logs complete under addition? – jwimberley Jun 07 '17 at 20:15
  • In short, yes to all of the above, i.e. the chain rule will need to be applied to evaluate it. That is, trig functions can contain polynomials, e.g. sin(2x^2 + 3x) and logarithmic may contain polynomial, e.g. ln|2x^2|. Correct, a complete space of functions involving polynomial, trig and log – Dean P Jun 07 '17 at 20:20
  • If you were using C++ you could create classes for the various function types (e.g. Polynomial, Sin, Cos, Log, Sum, Product, Composition) and specialize the value and derivative operators to apply whatever rule (e.g. chain rule for the derivative operator in Composition) is needed. Maybe doing something similar in plain C is possible, overloading functions on structs? – jwimberley Jun 07 '17 at 20:24
  • Identify pure polynomial functions and handle them as a special case, as the differential function is easy to calculate. For more complex functions I'm not sure, but I guess I would start out trying the numerical approach – Support Ukraine Jun 07 '17 at 20:24
  • For clarification purposes, *"How do you have these thousands of functions stored?"* As a general proposition, symbolic differentiation is generally more computationally intensive from the standpoint of chain rule application and lookup of the trig derivatives. But it is really hard to say without some understanding of how you have the functions stored. @jwimberley covers the case for polynomials, but you would need some type of lookup/replacement table for the trig functions. There, numeric seems like it would be the choice. Hard to say without a few examples. – David C. Rankin Jun 07 '17 at 20:55
  • "quickest time possible" is often a trade-off with accuracy. Knowing that relative importance is needed for a quality answer. – chux - Reinstate Monica Jun 07 '17 at 21:16
  • What is the "automatic" mode? Isn't that an instance of a numerical method? – Jun 08 '17 at 22:23
  • @DavidC.Rankin and chux. To be honest, I don't have exact specifications on the storage method and what acceptable accuracy is required at this very stage. I will define this more clearly in the coming month and give better clarity. – Dean P Jun 09 '17 at 08:48

2 Answers


If your functions are truly all polynomials, then symbolic derivatives are dirt simple. Letting the coefficients of the polynomial be stored in an array with entries p[k] = a_k, where index k corresponds to the coefficient of x^k, the derivative is represented by the array with entries dp[k] = (k+1) p[k+1]. For multivariable polynomials, this extends straightforwardly to multidimensional arrays. If your polynomials are not in standard form, e.g. if they include terms like (x-a)^2 or ((x-a)^2-b)^3 or whatever, a little bit of work is needed to transform them into standard form, but this is something you probably should be doing anyway.
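The coefficient-array rule above can be sketched in a few lines of C, paired with Horner evaluation for use inside Newton's iteration (function names here are my own, for illustration):

```c
/* Differentiate a polynomial stored as coefficients p[k] of x^k:
   dp[k] = (k+1) * p[k+1], producing n-1 coefficients from n. */
static void poly_deriv(const double *p, int n, double *dp) {
    for (int k = 0; k + 1 < n; k++)
        dp[k] = (k + 1) * p[k + 1];
}

/* Horner's scheme: evaluate p at x in n multiply-adds. */
static double poly_eval(const double *p, int n, double x) {
    double y = 0.0;
    for (int k = n - 1; k >= 0; k--)
        y = y * x + p[k];
    return y;
}
```

For example, p = {1, -3, 2} represents 2x^2 - 3x + 1, and poly_deriv produces {-3, 4}, i.e. 4x - 3. The derivative array is computed once per polynomial and reused for every Newton step.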

jwimberley
  • @4386427 In C, yes; I wasn't writing this with any programming language in mind. I wish stackoverflow allowed LaTeX. – jwimberley Jun 07 '17 at 20:12

If the derivative is not available, you should consider using the secant or regula falsi methods. They have very decent convergence speed (order φ ≈ 1.618 instead of quadratic). An additional benefit of regula falsi is that the iterates remain confined to the initial interval, which allows reliable root separation (which Newton does not).

Also note that in the case of numerical evaluation of the derivative, you will require several computations of the function per iteration, at best two of them. The effective convergence order per function evaluation then drops to √2 ≈ 1.414, which is outperformed by the derivative-free methods.
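To make the cost concrete, here is a sketch of one Newton step with a forward-difference derivative (the helper names and the demo function are mine, not from the question): each step pays for two evaluations of f, and the step size h trades truncation error against cancellation.

```c
/* Newton step with a forward-difference slope: costs two evaluations
   of f per step, and the O(h) error in the slope degrades Newton's
   quadratic convergence. */
static double newton_fd_step(double (*f)(double), double x, double h) {
    double fx    = f(x);
    double slope = (f(x + h) - fx) / h;   /* finite-difference derivative */
    return x - fx / slope;
}

/* Demo target: f(x) = x*x - 2, with root sqrt(2). */
static double f_demo(double x) { return x * x - 2.0; }
```

In practice h is often chosen around sqrt(DBL_EPSILON) times the scale of x; too small and the subtraction f(x+h) - f(x) loses all its significant digits.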

Also note that the symbolic expression of the derivative is often more costly to evaluate than the function itself. So one iteration of Newton costs at least two function evaluations' worth of work, spoiling the benefit of the convergence rate.
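For comparison with the Newton variants above, a minimal secant iteration in C might look like this (an illustrative sketch with hypothetical names; no bracketing or failure handling): it reuses the previous function value, so each step costs only one new evaluation of f.

```c
#include <math.h>

/* Derivative-free secant iteration: the slope is estimated from the
   last two iterates, so each step adds one function evaluation. */
static double secant(double (*f)(double), double x0, double x1,
                     double tol, int max_iter) {
    double f0 = f(x0), f1 = f(x1);
    for (int i = 0; i < max_iter && fabs(f1) > tol; i++) {
        double x2 = x1 - f1 * (x1 - x0) / (f1 - f0);  /* secant step */
        x0 = x1; f0 = f1;
        x1 = x2; f1 = f(x1);
    }
    return x1;
}

/* Example target: f(x) = x*x - 2, root at sqrt(2). */
static double f_example(double x) { return x * x - 2.0; }
```

At order φ per evaluation it beats finite-difference Newton's effective √2, which is the point of the answer; a production version would also guard against f1 == f0 and a vanishing bracket.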

  • Upvoted for the regula falsi suggestion - never heard of it. I'm really curious as to what methods proprietary software packages like Matlab use – Dean P Jun 09 '17 at 08:45
  • @DeanP: most probably much more sophisticated methods with bracketing and handling of multiple roots. –  Jun 09 '17 at 08:48