
I coded three versions of the Fibonacci function and I'm not quite sure about their time/space complexities (and why):

Variant 1: Head recursion

int fibonacci_h(int n) {
  if (n == 1 || n == 2) {
    return 1;
  }
  return fibonacci_h(n - 1) + fibonacci_h(n - 2);
}

Variant 2: Tail recursion

int fibonacci_t(int n, int s_last = 0, int last = 1) {
  if (n == 1) {
    return last;
  }
  return fibonacci_t(n - 1, last, s_last + last);
}

Variant 3: Head recursion with cache

int fibonacci_hash(int n, unordered_map<int, int>* fib_hash = new unordered_map<int, int>{{1, 1}, {2, 1}}) {
  // Note: the default argument allocates (and leaks) a fresh map on each
  // top-level call; the cache is shared only across the recursive calls.
  if (fib_hash->find(n) != fib_hash->end()) {
    return (*fib_hash)[n];
  }
  int result = fibonacci_hash(n - 1, fib_hash) + fibonacci_hash(n - 2, fib_hash);
  (*fib_hash)[n] = result;
  return result;
}

Usage

#include <iostream>
#include <unordered_map>  // needed by fibonacci_hash
using namespace std;

int main() {
  int n = 10;
  cout << fibonacci_h(n) << endl;
  cout << fibonacci_t(n) << endl;
  cout << fibonacci_hash(n) << endl;
} // Output: 55 (printed by each variant)

Greatly appreciated!

Michel H

2 Answers


The one that does naïve recursion -- you call it head recursion -- is exponential, O(2^n). The other two are O(n) time. The memoized one and the tail-recursive one are also both O(n) space, though: in the former the hash table grows to O(n) entries, and in the latter the call stack reaches depth O(n) -- without tail-call optimization, that is. With tail-call optimization, the tail-recursive version essentially compiles down to the iterative version, which is O(n) time and O(1) space.

jwezorek
  • Are you saying: 1. Time O(2^n) / Space O(n)? (Orders of exponential functions, as far as I know, are not equivalent, so shouldn't one say which exponential function?) 2. Time O(n) / Space O(n)? I'm not quite sure about the space complexity on this one, since tail recursion doesn't compound the call stack (the return of the base case ends up being the return of all the chained recursive calls) -- why is this not O(1)? 3. Time O(n) / Space O(n)? – Michel H Oct 23 '20 at 17:51
  • yeah, the tail-recursive one will not be O(n) space given tail-call optimization. But anyway, in terms of time, 1 is O(2^n) and the other two are O(n). – jwezorek Oct 23 '20 at 17:56
  • So in space: 1. O(n) 2. O(1) 3. O(n)? Saying 2 is not O(n) does not confirm it is O(1). – Michel H Oct 23 '20 at 18:09
  • yeah, if you are counting the stack as part of "space", which I believe is standard, and if the compiler of the tail-recursive one uses tail-call optimization. You know, the other way to compute fibo(n) is to just do it iteratively, which is O(n) time and O(1) space. There is also a way to compute it in log(n) time using Binet's formula. – jwezorek Oct 23 '20 at 19:03
  • The second version is [collapsed to iteration](https://godbolt.org/z/4YdY4v) from optimisation level -O2 upwards. Thus it will run in constant space. – bitmask Oct 23 '20 at 19:34
  • The head-recursion version is exponential, but it's actually much less than O(2^n). – einpoklum Oct 28 '20 at 23:38
  • @bitmask: Well, that's assuming an integer is O(1) space rather than O(log(n))... – einpoklum Oct 28 '20 at 23:40

As mentioned in this answer as well -- the time complexity of your "head" version is Θ(((1 + sqrt(5))/2)^n) ≈ Θ(1.618^n). This relates to the golden ratio.
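To make that concrete: the number of calls C(n) that fibonacci_h(n) makes obeys essentially the same recurrence as the Fibonacci numbers themselves (a standard derivation, sketched here; not part of the original answer):

```latex
C(1) = C(2) = 1, \qquad C(n) = C(n-1) + C(n-2) + 1 .
% Unrolling gives the closed form C(n) = 2F(n) - 1, so the call count
% grows at the same rate as F(n) itself. Substituting the ansatz
% F(n) \sim \varphi^n into F(n) = F(n-1) + F(n-2) yields the
% characteristic equation
\varphi^2 = \varphi + 1
\;\Longrightarrow\;
\varphi = \frac{1 + \sqrt{5}}{2} \approx 1.618 ,
% hence C(n) = \Theta(\varphi^n).
```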


Asymptotically, (a+b)/a = a/b, i.e. 1 + b/a = a/b; and (1 + sqrt(5))/2 is the solution to this equation. This ratio characterizes conchs in nature too, apparently.

The space complexity of your "head" version is Θ(n), because it's determined by the maximum depth of the recursion (each level of depth adds another constant number of local variables and stack-frame info).

I'm ignoring the log(n) bits that go into a single integer. That is, you should actually multiply everything by log(n) if you want to be super-pedantic.

einpoklum
  • I'm not sure that just because fibo(n) can be approximated by phi^n it means that the running time of the naive recursive version is O(phi^n). phi^n approximates *the value* of fibo(n) but the value appears nowhere in the recursive implementation. Where I was getting the 2 from in my claim of 2^n is that each "level" of recursion is going to make two times as many recursive calls as the level before. So the question is how many levels will there be ... which indeed will be something like log base phi of n. So the running time would be more like 2 ^ (log base phi of n) – jwezorek Oct 29 '20 at 16:53
  • "the value appears nowhere in the recursive implementation" <- Not so. It is in fact the case that the value and the number of calls to the function are the same (except for n=2). – einpoklum Oct 29 '20 at 17:02