
A while back a friend wanted help with a program that could find the roots of functions using Newton's method. For that I needed some way to calculate the derivative of a function numerically, and this is what I came up with:

deriv f x = (f (x+h) - f x) / h where h = 0.00001

Newton's method was a fairly easy thing to implement (a rough sketch of what I have is below), and it works rather well. But now I've started to wonder: is there some way I could use this function to compute partial derivatives numerically, or is that something that would require a full-on CAS? I would post my attempts at the partial derivatives, but I have absolutely no clue what to do yet.
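Here's roughly what my Newton iteration looks like, in case it matters (simplified to a fixed number of steps rather than a proper stopping test):

newton :: (Double -> Double) -> Double -> Double
newton f x0 = iterate step x0 !! 50        -- repeatedly apply x -> x - f x / f'(x)
  where step x = x - f x / deriv f x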

Please keep in mind that I am new to Haskell. Thank you!

Ben DalFavero

2 Answers


You can certainly do much the same thing as you already implemented, only with multivariate perturbation instead. But first, as you should always do with top-level functions, add a type signature:

deriv :: (Double -> Double) -> Double -> Double

That's not the most general signature possible, but probably sufficiently general for everything you'll need. For brevity, I'll define

type ℝ = Double

and use it in the following, i.e.

deriv :: (ℝ -> ℝ) -> ℝ -> ℝ

Now what you want is, for example in ℝ²:

grad :: ((ℝ,ℝ) -> ℝ) -> (ℝ,ℝ) -> (ℝ,ℝ)
grad f (x,y) = ((f (x+h,y) - f (x,y)) / h, (f (x,y+h) - f (x,y)) / h)
 where h = 0.00001
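As a quick sanity check (my own throwaway test function, nothing special): for f(x,y) = x·y + y² the exact gradient at (1,2) is (2,5), and the finite-difference version should agree up to roughly h:

gradExample :: (ℝ,ℝ)
gradExample = grad (\(x,y) -> x*y + y^2) (1,2)   -- ≈ (2.0, 5.0)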

It's awkward to have to write out the components individually and to tie the definition to a vector space of one particular dimension. A generic way of doing it:

import Data.VectorSpace
import Data.Basis

grad :: (HasBasis v, Scalar v ~ ℝ) => (v -> ℝ) -> v -> v
grad f x = recompose [ (e, (f (x ^+^ h *^ basisValue e) - f x) ^/ h)
                     | (e,_) <- decompose x ]
 where h = 0.00001
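If I remember right, vector-space already provides HasBasis instances for pairs and triples of scalars, so this works directly on tuples, e.g. (again with throwaway test functions of my own):

gradPair :: (ℝ,ℝ)
gradPair = grad (\(x,y) -> x*y + y^2) (1,2)          -- ≈ (2, 5)

gradTriple :: (ℝ,ℝ,ℝ)
gradTriple = grad (\(x,y,z) -> x*z + y^2) (1,2,3)    -- ≈ (3, 4, 1)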

Note that finite differencing with a pre-chosen step size is always a tradeoff between truncation error from the neglected higher-order terms and floating-point round-off error, so definitely check out automatic differentiation as well.
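To see that tradeoff concretely, here is a quick sketch (a toy example of mine) measuring the error of the forward difference of exp at x = 1 for various step sizes; the exact crossover point depends on the machine:

-- Error of the forward difference of exp at x = 1.
-- Shrinking h first reduces the truncation error (≈ h/2 · f''(x)),
-- but eventually the cancellation/round-off error (≈ ε/h) takes over.
fdError :: Double -> Double
fdError h = abs ((exp (1 + h) - exp 1) / h - exp 1)

-- try e.g.:  map fdError [1e-1, 1e-5, 1e-8, 1e-12, 1e-15]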

leftaroundabout

This is called automatic differentiation and there is a lot of really neat work in this area in Haskell, though I don't know how accessible it is.

From the wiki page:
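To give a concrete taste (this is not the wiki's example, just a minimal sketch using the ad package, assuming I remember its API correctly):

import Numeric.AD (grad)

-- grad works on any Traversable container of numbers and needs no step size;
-- the result is exact up to floating point. For f(x,y) = x*y + y² at (1,2):
main :: IO ()
main = print (grad (\[x,y] -> x*y + y^2) [1, 2 :: Double])   -- [2.0,5.0]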

luqui