
Say I have an algorithm which operates on an input of size n and I know that the time it takes for n is twice the time it takes for n-1. I can observe in this simple case (assuming it takes, say, 1 second for n = 0) that the algorithm takes 2^n seconds.

Is there a general method for converting such recursive definitions into the more familiar direct type of expression?
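In code, what I have in mind is something like this (a minimal sketch; the function names are just for illustration):

```python
# Recursively-defined running time: T(n) = 2 * T(n - 1), with T(0) = 1
def t_recursive(n):
    if n == 0:
        return 1
    return 2 * t_recursive(n - 1)

# The "direct" expression I observed for this case: T(n) = 2^n
def t_closed(n):
    return 2 ** n

# The two agree for small n
assert all(t_recursive(n) == t_closed(n) for n in range(20))
```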

NellerLess

3 Answers


Master Theorem

In particular:

With T(n) = a·T(n/b) + n^c:

If log_b(a) < c, then T(n) = O(n^c)

If log_b(a) = c, then T(n) = O(n^c · log n)

If log_b(a) > c, then T(n) = O(n^(log_b(a)))
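
As a rough sketch (mine, not part of the theorem), the case analysis can be done mechanically; applied to mergesort, T(n) = 2·T(n/2) + n gives a = b = 2, c = 1, hence O(n log n):

```python
import math

def master_theorem(a, b, c):
    """Classify T(n) = a*T(n/b) + n^c using the three cases above."""
    crit = math.log(a, b)              # the critical exponent log_b(a)
    if math.isclose(crit, c):
        return f"O(n^{c} log n)"
    elif crit < c:
        return f"O(n^{c})"
    else:
        return f"O(n^{crit:g})"

# Mergesort: T(n) = 2*T(n/2) + n  ->  log_2(2) = 1 = c, so O(n log n)
print(master_theorem(2, 2, 1))         # O(n^1 log n)
# Karatsuba multiplication: T(n) = 3*T(n/2) + n  ->  log_2(3) > 1
print(master_theorem(3, 2, 1))         # O(n^1.58496)
```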

That's one useful theorem to know, but doesn't fully answer your question.

What you are looking for is the generating function of a recurrence relation. In general, these are only solvable in closed form for very simple cases, e.g. when f(n) = A·f(n-1) + B·f(n-2) and f(0) = f(1) = 1 (or f(1) = A). Other recurrence relations are very difficult to solve.
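
For that simple linear case, the usual recipe is to solve the characteristic equation x^2 = A·x + B and take a linear combination of its roots. Here is a rough sketch (assuming the roots are real and distinct), using Fibonacci with A = B = 1 and f(0) = f(1) = 1 as the example:

```python
import math

# Closed form for f(n) = A*f(n-1) + B*f(n-2) with f(0) = f(1) = 1,
# assuming the characteristic equation x^2 = A*x + B has two distinct real roots.
def closed_form(A, B):
    disc = math.sqrt(A * A + 4 * B)          # discriminant of x^2 - A*x - B = 0
    r1, r2 = (A + disc) / 2, (A - disc) / 2  # the two roots
    # Fit the constants to c1 + c2 = f(0) = 1 and c1*r1 + c2*r2 = f(1) = 1
    c2 = (1 - r1) / (r2 - r1)
    c1 = 1 - c2
    return lambda n: c1 * r1 ** n + c2 * r2 ** n

# Fibonacci (A = B = 1): this reproduces a Binet-style formula
fib = closed_form(1, 1)
print([round(fib(n)) for n in range(10)])  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```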

See linear recurrence relation for more info.

Peter Alexander

"Recursive functions" like this are called Recurrence Relations, and their "direct types" are known as the Closed-form solution.

While the Master Theorem listed by Poita is very helpful in computing time-complexity, it has nothing to do with actually solving recurrence relations.

Wikipedia and Wolfram MathWorld (under "See Also") list the closed forms of some common classes of recurrence relations. However, for complicated (non-linear) recurrence relations it can be very difficult to find a closed-form solution, if one exists at all. There is no general algorithm for solving them.

BlueRaja - Danny Pflughoeft

If it's linear, you can express the relation as a matrix and find its eigenvalues, decomposing it into a form that lets you raise the eigenvalues to a power, as worked through for Fibonacci here.
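
A rough sketch of that idea for Fibonacci (using numpy, and assuming the companion matrix has distinct eigenvalues, which it does here):

```python
import numpy as np

# Companion matrix of f(n) = f(n-1) + f(n-2):
# [f(n+1), f(n)]^T = M @ [f(n), f(n-1)]^T
M = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Diagonalise: M = V @ diag(w) @ V^-1, so M^n = V @ diag(w^n) @ V^-1
w, V = np.linalg.eig(M)

def fib(n):
    # Raise the eigenvalues to the n-th power instead of multiplying n matrices
    Mn = V @ np.diag(w ** n) @ np.linalg.inv(V)
    # Start from [f(1), f(0)] = [1, 1]; the second component of M^n @ v0 is f(n)
    return int(round((Mn @ np.array([1.0, 1.0]))[1]))

print([fib(n) for n in range(10)])  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```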

Pete Kirkham