29

I am trying to find the complexity of the Fibonacci series using a recursion tree. I concluded that the height of the tree is O(n) in the worst case and the cost of each level is cn, hence the complexity is n*n = O(n^2).

How come it is O(2^n)?

Sled
Suri
  • possible duplicate of [Computational complexity of Fibonacci Sequence](http://stackoverflow.com/questions/360748/computational-complexity-of-fibonacci-sequence) – hammar Sep 25 '11 at 17:30
  • It is theta (F_N) (and so O(2^N)), even if you consider the complexity of adding two n bits numbers to be Polynomial in N, I believe. – user127.0.0.1 Sep 25 '11 at 18:15
  • As a followup question, is the time complexity equal to n^2 provided we store each F_k all the way up to F_n? Since in this case we are merely performing a total of n additions, where the length of F_k is O(k). – rondo9 Apr 08 '15 at 22:07

9 Answers

37

The complexity of a naive recursive Fibonacci is indeed O(2ⁿ).

T(n) = T(n-1) + T(n-2) = T(n-2) + T(n-3) + T(n-3) + T(n-4) = 
= T(n-3) + T(n-4) + T(n-4) + T(n-5) + T(n-4) + T(n-5) + T(n-5) + T(n-6) = ...

At each step you call T twice, which provides an eventual asymptotic upper bound of:
T(n) ≤ 2⋅2⋅...⋅2 = O(2ⁿ)
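
To make this growth concrete, here is a minimal sketch of the naive recursion with a call counter added (the counter and class name are my own, not from the thread):

```java
public class NaiveFib {
    static long calls = 0; // counts every invocation of fib

    static long fib(int n) {
        calls++;
        if (n <= 1) return n;           // base cases: F(0) = 0, F(1) = 1
        return fib(n - 1) + fib(n - 2); // two recursive calls per step
    }

    public static void main(String[] args) {
        for (int n = 5; n <= 25; n += 5) {
            calls = 0;
            long f = fib(n);
            System.out.println("fib(" + n + ") = " + f + " took " + calls + " calls");
        }
    }
}
```

Running this shows the call count roughly doubling every couple of steps of n, which is the exponential blow-up the recurrence above describes.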

bonus: The best theoretical implementation of Fibonacci is actually a closed-form formula, using the golden ratio:

Fib(n) = (φⁿ – (–φ)⁻ⁿ)/sqrt(5) [where φ is the golden ratio]

(However, it suffers from precision errors in real life due to floating-point arithmetic, which is not exact.)
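
A sketch of that closed-form approach (my own example): since the (–φ)⁻ⁿ term shrinks toward zero, rounding φⁿ/√5 recovers F(n) exactly for small n, though double precision eventually breaks down for large n.

```java
public class BinetFib {
    // closed-form (Binet) Fibonacci; assumes n is small enough for double precision
    static long fib(int n) {
        double phi = (1 + Math.sqrt(5)) / 2; // golden ratio, about 1.618
        // the (-phi)^(-n) term vanishes as n grows, so rounding suffices
        return Math.round(Math.pow(phi, n) / Math.sqrt(5));
    }

    public static void main(String[] args) {
        System.out.println(fib(10)); // 55
        System.out.println(fib(40)); // 102334155
    }
}
```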

amit
  • if we have recusion T(n) = T(n/2)+T(n/2) then will complexity be 2^n/2.please correct me if i am wrong? – Suri Sep 25 '11 at 17:37
  • 1
    @Suri: in your example [in the comment] it is different, since n is decreasing exponentially as well in T(n): `T(n) = T(n/2) + T(n/2) = T(n/4) + T(n/4) + T(n/4) + T(n/4) = ... = T(n/2^logn) + ... + T(n/2^logn) [2^logn times] = 2^logn = n` – amit Sep 25 '11 at 17:49
  • 3
    @amit- Note that while you call T twice, it's not on the same level and 2^n is not a tight bound. For example, to compute F(n), you only compute F(n - 1) once. – templatetypedef Sep 25 '11 at 18:17
  • @templatetypedef: I deliberately avoided using the word 'tight' or 'exactly', since it is obviously not the case. This answer doesn't even prove the asymptotic bound, it just meant to show the OP a basic tool to roughly evaluate this complexity. – amit Sep 25 '11 at 18:20
  • @amit Isn't T(n) = 2+2^2+2^3+2^4 ⋅...⋅2 = 2ⁿ rather than T(n) = 2⋅2⋅...⋅2 = 2ⁿ .This is to clarify my understanding. Aren't we adding the sum of the nodes at each level of the recursion tree ? – crackerplace Dec 30 '14 at 21:54
  • @amit Also, T(n) = T(n − 1) + T(n − 2) + 1, assuming there is some work to add the results of the 2 recursive calls.So ,What I am unable to get clearly is that, are we just counting the number of method invocations at each level of the tree or even the work done by each of these invocation's i.e the addition.Of course we might have ignored that, assuming its constant.So does that mean we are interested in overall time taken for these recursive invocations by the runtime and not the actual work(comparing if n=0 or 1 ,addition etc) done by the runtime for each of these invocations. – crackerplace Dec 30 '14 at 22:45
  • re bonus: Total nitpick but note: at a high level this is true, but if you take the algorithm at a lower level, the # of operations floating point arithmetic and exponentiation could make it a worse O than the memoization option. (I think it depends on the system). That said, the closed formula is cool! – ThinkBonobo Feb 25 '15 at 16:47
14

The recursion tree for fib(n) would be something like :

                              n       
                           /     \
                          n-1    n-2      --------- maximum 2^1 additions
                         /  \    /   \
                       n-2   n-3 n-3 n-4  -------- maximum 2^2 additions
                      /   \           
                     n-3 n-4              -------- maximum 2^3 additions         
                                                ........
                                          -------- maximum 2^(n-1) additions  
  1. Using n-1 in 2^(n-1) since for fib(5) we will eventually go down to fib(1)
  2. Number of internal nodes = Number of leaves - 1 = 2^(n-1) - 1
  3. Number of additions = Number of internal nodes + Number of leaves = (2^1 + 2^2 + 2^3 + ...) + 2^(n-1)
  4. We can replace the number of internal nodes with 2^(n-1) - 1 because it will always be at most this value : = 2^(n-1) - 1 + 2^(n-1) ~ 2^n
Anum Malik
7

You are correct that the depth of the tree is O(n), but you are not doing O(n) work at each level. At each level, you do O(1) work per recursive call, but each recursive call then contributes two new recursive calls, one at the level below it and one at the level two below it. This means that as you get further and further down the recursion tree, the number of calls per level grows exponentially.

Interestingly, you can actually establish the exact number of calls necessary to compute F(n) as 2F(n + 1) - 1, where F(n) is the nth Fibonacci number. We can prove this inductively. Let's say that L(n) is the number of calls necessary to compute F(n). As a base case, to compute F(0) or F(1), we need to make exactly one call to the function, which terminates without making any new calls. Then we have that

L(0) = 1 = 2*1 - 1 = 2F(1) - 1 = 2F(0 + 1) - 1

L(1) = 1 = 2*1 - 1 = 2F(2) - 1 = 2F(1 + 1) - 1

Now, for the inductive step, assume that for all n' < n, with n ≥ 2, we have L(n') = 2F(n' + 1) - 1. Then to compute F(n), we need to make 1 call to the initial function that computes F(n), which in turn fires off calls for F(n - 1) and F(n - 2). By the inductive hypothesis we know that F(n - 1) and F(n - 2) can be computed in L(n - 1) and L(n - 2) calls. Thus the total number of calls is

1 + L(n - 1) + L(n - 2)

= 1 + 2F((n - 1) + 1) - 1 + 2F((n - 2) + 1) - 1

= 2F(n) + 2F(n - 1) - 1

= 2(F(n) + F(n - 1)) - 1

= 2(F(n + 1)) - 1

= 2F(n + 1) - 1

Which completes the induction.

At this point, you can use Binet's formula to show that

L(n) = 2(1/√5)(((1 + √5) / 2)ⁿ - ((1 - √5) / 2)ⁿ) - 1

And thus L(n) = O(((1 + √5) / 2)ⁿ). If we use the convention that

φ = (1 + √5) / 2 ≈ 1.6

We have that

L(n) = Θ(φⁿ)

And since φ < 2, this is o(2ⁿ) (using little-o notation).

Interestingly, I've chosen the name L(n) for this series because this series is called the Leonardo numbers. In addition to its use here, it arises in the analysis of the smoothsort algorithm.
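
The identity L(n) = 2F(n + 1) - 1 is easy to check empirically; here is a small sketch (the counter and class name are mine), comparing the counted calls of the naive recursion against an iteratively computed F(n + 1):

```java
public class LeonardoCheck {
    static long calls; // incremented on every recursive call

    static long fib(int n) {
        calls++;
        return n <= 1 ? n : fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        long a = 0, b = 1; // a = F(n), b = F(n + 1), maintained iteratively
        for (int n = 0; n <= 20; n++) {
            calls = 0;
            fib(n);
            if (calls != 2 * b - 1)
                throw new AssertionError("identity fails at n = " + n);
            long next = a + b;
            a = b;
            b = next;
        }
        System.out.println("L(n) = 2F(n+1) - 1 verified for n = 0..20");
    }
}
```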

Hope this helps!

templatetypedef
  • thanks for ur reply i understood your point but if we have recursion T(n) = T(n/2)+T(n/2) then will complexity be 2^n/2.please correct me if i am wrong? – Suri Sep 25 '11 at 17:43
  • @Suri- The recurrence T(n) = 2T(n / 2), T(1) = 1 solves to O(n), I believe. You should post that as a separate question so that people can give you a more detailed answer. – templatetypedef Sep 25 '11 at 17:47
  • Downvoter- Can you please explain what's wrong with this answer? I believe that it's mathematically correct and does indeed answer the question. If I'm wrong about this, please let me know what I can do to improve the answer. – templatetypedef Sep 25 '11 at 18:19
  • @templatetypedef: Yes, it does. If the complexity of `T(k)` is at most `k` for `k <= n - 1` then the complexity of `T(n)` is at most `T(n) = T(n / 2) + T(n / 2) <= 2 * n / 2 = n`. – jason Sep 25 '11 at 18:26
  • I love this solution. In Java it's just: `double phi = 1.6180339887; return Math.round((Math.pow(phi, n) / Math.sqrt(5)));` – Hal50000 Jul 08 '15 at 13:09
  • It's easier if you define the recursion as F(1)=F(2)=1, F(n)=F(n-1)+F(n-2). Then you can see that each call either corresponds to an addition, or a return of 1. So if you expand out the recursion symbolically -- say F(5) = F(4) + F(3) = (F(3)+F(2)) + (F(2)+F(1)) = ((F(2)+F(1)) + F(1) + (F(2)+F(1)) = ((1+1)+1)+(1+1), it's obvious that there's F(n) 1's (since they must add up to F(n)), and F(n)-1 additions. Then the number of recursive calls is 2F(n)-1. – Paul Hankin Jul 27 '17 at 07:32
6

t(n) = t(n-1) + t(n-2), which can be solved through the tree method:

                                  t(n-1)  +  t(n-2)        2^1=2
                                   |         |  
                            t(n-2)+t(n-3)  t(n-3)+t(n-4)   2^2=4  
                                .               .          2^3=8
                                .               .           .
                                .               .           .

similarly for the last level . . 2^n

This makes the total time complexity 2 + 4 + 8 + ... + 2^n; solving this geometric progression gives a time complexity of O(2^n).

mukesh
6

Look at it like this. Assume the complexity of calculating F(k), the kth Fibonacci number, by recursion is at most 2^k for k <= n. This is our induction hypothesis. Then the complexity of calculating F(n + 1) by recursion is

F(n + 1) = F(n) + F(n - 1)

which has complexity 2^n + 2^(n - 1). Note that

2^n + 2^(n - 1) = 2 * 2^n / 2 + 2^n / 2 = 3 * 2^n / 2 <= 2 * 2^n = 2^(n + 1).

We have shown by induction that the claim that calculating F(k) by recursion is at most 2^k is correct.

jason
  • Where is the basis in your induction? Without the basis I can prove virtually anything by induction. – AnT stands with Russia Sep 25 '11 at 17:35
  • @Jason- Actually, it's really important to get this right. It is *extremely* easy to do inductive proofs of big-O runtimes incorrect because you can accidentally end up hiding an exponential term in the big-O constants. For example, I'll prove that T(n) = 2T(n - 1), T(1) = 1 is O(1). As a base case, T(1) = 1 = O(1). For the inductive step, if the claim holds for n' < n, then T(n) = 2T(n - 1) = 2 O(1) = O(1). But this function is really O(2^n). In other words, these sorts of proofs are much harder than they look. – templatetypedef Sep 25 '11 at 17:58
  • @templatetypedef: However, in this case one can easily prove by induction T(n) <= 2^n. base is trivial. `T(n+1) = T(n) + T(n-1) <= 2^n + 2^(n-1) = 3*2^(n-1) < 4*2^(n-1) = 2^(n+1)`. [correct me if I'm wrong] – amit Sep 25 '11 at 18:04
  • 1
    @amit- Yes, you're absolutely correct. The point I'm trying to make is that it's not sufficient to prove that the runtime is O(f(n)) by induction for any f(n), and that you have to give an explicit function that you're trying to prove the runtime never exceeds. But definitely in this case you can show a bound of 2^n. – templatetypedef Sep 25 '11 at 18:06
  • @templatetypedef: I see your point, in this case, in order to prove O(2^n), one should actually give the bound of c=1, and prove `T(n)<=1*2^n`. thanks for this comment. – amit Sep 25 '11 at 18:12
  • 1
    @templatetypedef: I understand the point that you're trying to make, but are you saying I got it wrong? – jason Sep 25 '11 at 18:24
  • @Jason- No, your result is correct. I was mostly concerned because the argument you made to support it was invalid. Since you just updated the answer, I've removed the downvote. – templatetypedef Sep 25 '11 at 18:28
  • @Jason: No, I'm not joking. The proof is trivial as well. I'm not saying your proof is incorrect. I'm just saying that it is formally incomplete. – AnT stands with Russia Sep 25 '11 at 19:40
  • 1
    @AndreyT: Have you ever looked at a math journal? Are you aware of the phrases "left as an exercise to the read", "the proof is obvious", etc.? – jason Sep 25 '11 at 20:07
  • @Jason: Well, I repeat: the entire proof in this case is pretty obvious. Yet, for some reason you decided to spell out the inductive step. At the same time you don't even mention the basis (not even in "left as an exercise to the read", or as "the proof is obvious"). Your dispropotionally inflamed reaction is probably caused by your assumption that I downvoted. I didn't downvote. – AnT stands with Russia Sep 25 '11 at 20:15
  • @AndreyT: Given there are no downvotes on my post, I don't think you downvoted. Inflamed reaction? Are you okay? – jason Sep 25 '11 at 20:23
  • 1
    @Jason: Er... I'm a bit surprised that I have to explain something as simple, but anyway... Given that this discussion has been ongoing for a certain period of time, I think you'd agree that what matters is whether there *was* a downvote on your post. The fact that there's no downvote at this moment is hardly relevant. And there *was* a downvote, wasn't there? – AnT stands with Russia Sep 25 '11 at 22:39
  • @AndreyT: Okay, this is getting ridiculous, and very frustrating. I never made the assumption that you downvoted. That's all that matters, not whether or not there was a downvote on my post. – jason Sep 25 '11 at 22:55
1

The complexity of the recursive Fibonacci series is O(2^n).
This is the recurrence relation for recursive Fibonacci:

 T(n)=T(n-1)+T(n-2)                  No  of elements  2

Now, solving this relation using the substitution method (substituting the values of T(n-1) and T(n-2)):

T(n)=T(n-2)+2*T(n-3)+T(n-4)          No  of elements  4=2^2

Substituting the values of the above terms again, we get

T(n)=T(n-3)+3*T(n-4)+3*T(n-5)+T(n-6)  No  of elements  8=2^3

After solving it completely, we get

T(n)={T(n-k)+---------+---------}----------------------------->2^k  eq(3) 

This implies that the maximum number of recursive calls at any level will be at most 2^n.
And since each of the recursive calls in equation 3 does ϴ(1) work, the time complexity will be 2^n * ϴ(1) = O(2^n).

Tanuj Yadav
Mohit Yadav
1

The complexity of the recursive Fibonacci computation is O(F(k)), where F(k) is the kth Fibonacci number. This can be proved by induction. It is trivial for the base case. Assume that for all k <= n, the complexity of computing F(k) is c*F(k) + o(F(k)). Then for k = n + 1, the complexity of computing F(n + 1) is c*F(n) + o(F(n)) + c*F(n - 1) + o(F(n - 1)) = c*(F(n) + F(n - 1)) + o(F(n)) + o(F(n - 1)) = O(F(n + 1)).

Zhongyuan
  • 2
    In general this sort of argument doesn't work because you have to be extremely precise about what the constant factors are in the big-O terms. Doing induction with big-O can easily lead you to prove completely incorrect results where at each step the math is right, but because you're hiding progressively larger and larger constants inside the big-O term you end up proving something that grows exponentially quickly actually does not. To prove this more formally, you would have to actually come up with the constants n0 and c. – templatetypedef Sep 25 '11 at 17:43
  • @template: Did you notice the smallOh? There is a big difference between smallOh and BigOh... – user127.0.0.1 Sep 26 '11 at 04:08
  • 1
    @user127.0.0.1- I don't believe that changes things here; a similarly flawed inductive proof could be made that way. Again, my complaint isn't the result as much as the method. – templatetypedef Sep 26 '11 at 04:22
  • @template: I was mainly pointing out that your comment about constants is not applicable to this proof. The proof is flawed, of course. Fundamentally, it is meaningless to talk about asymptotics when you restrict yourself to finite n. (i.e. the statement `T(k) = C*F(k) + o(F(k)) for k <= n` is meaningless). – user127.0.0.1 Sep 26 '11 at 15:31
0

The O(2^n) complexity of Fibonacci number calculation applies only to the naive recursive approach. With a little extra space, you can achieve much better O(n) performance:

public static int fibonacci(int n) throws Exception {

  if (n < 0)
    throw new Exception("Can't be a negative integer");

  if (n <= 1)
    return n;

  int s = 0, s1 = 0, s2 = 1;
  for (int i = 2; i <= n; i++) {
    s = s1 + s2;
    s1 = s2;
    s2 = s;
  }
  return s;
}
LarsTech
vic
0

I cannot resist the temptation of connecting the linear-time iterative algorithm for Fib to the exponential-time recursive one: if one reads Jon Bentley's wonderful little book "Writing Efficient Programs", I believe it is a simple case of "caching": whenever Fib(k) is calculated, store it in an array FibCached[k]. Whenever Fib(j) is called, first check whether it is cached in FibCached[j]; if yes, return the value; if not, use recursion. (Look at the tree of calls now ...)
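
A minimal sketch of that caching idea (the FibCached name follows the answer; the array bound, sentinel value, and class name are my assumptions):

```java
import java.util.Arrays;

public class CachedFib {
    // F(92) is the largest Fibonacci number that fits in a signed long;
    // -1 marks "not yet computed"
    static long[] fibCached = new long[93];
    static {
        Arrays.fill(fibCached, -1);
        fibCached[0] = 0;
        fibCached[1] = 1;
    }

    static long fib(int n) {
        if (fibCached[n] >= 0) return fibCached[n];    // cache hit: O(1)
        return fibCached[n] = fib(n - 1) + fib(n - 2); // compute once, store
    }

    public static void main(String[] args) {
        System.out.println(fib(50)); // 12586269025
    }
}
```

Each Fib(k) is now computed exactly once, so the tree of calls collapses to a chain and the total work becomes O(n).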