
I'm given this function and want to know the time complexity:

int f2(int n) {
    if(n<2) return 1;
    return f2(n/2)+f2(n-2);
}

I calculated its runtime to be O(n^2) using the telescopic expansion method. Is this correct?

Edit: After reconsidering, I found that this function has a similar structure to mergesort, which has complexity Θ(n log n). Is that correct?

asked by Caesar23; edited by templatetypedef

6 Answers


Suppose that it's polynomial in n, i.e. that the running time T(n) is at most a*n^b for some constants a and b. Then, we have:

T(n) = T(n/2) + T(n-2)
     = a*(n/2)^b + a*(n-2)^b
     = a/2^b * n^b + a*n^b + (lower-order terms)
     = a*n^b * (1/2^b + 1) + (lower-order terms)

where the (lower-order terms) consist of sums and differences of n raised to powers strictly less than b. For sufficiently large n, the a*n^b * (1/2^b + 1) term dominates the lower-order terms, and it is always larger than a*n^b, so it's not true that T(n) is O(n^b) for any constant b.

Now suppose that it's exponential in n, i.e. that T(n) is at most a*b^n for some constants a and b. Then:

T(n) = T(n/2) + T(n-2)
     = a*b^(n/2) + a*b^(n-2)
     = a*b^n * (b^(-n/2) + 1/b^2)

Now if b > 1 and n >= -2 log(1 - 1/b^2) / log(b), then:

n >= -2 log(1 - 1/b^2) / log(b)
-n/2 <= log(1 - 1/b^2) / log(b) = log_b(1 - 1/b^2)
b^(-n/2) <= 1 - 1/b^2
b^(-n/2) + 1/b^2 <= 1
T(n) = a*b^n * (b^(-n/2) + 1/b^2) <= a*b^n * 1 = a*b^n

Hence, for any a > 0 and b > 1, there exists an N (equal to -2 log(1 - 1/b^2) / log(b)) such that for all n >= N, T(n) <= a*b^n.

So what does this say? T(n) grows faster than any polynomial (which also rules out polylogarithmic bounds, since any polylogarithmic function is o(a polynomial)), but more slowly than any exponential. I'm not sure whether there's a simple closed form for the exact growth rate, but those are some useful bounds.

EDIT

As nhahtdh points out, this makes the running time something called quasi-polynomial time.

Adam Rosenfield

TL;DR: I think the recurrence is n^O(log n), which is super-polynomial but subexponential. I don't have a matching lower bound and I doubt this is tight, but it's progress!

I think I can get an upper bound that's subexponential but super-polynomial. Combined with @Adam Rosenfield's answer that shows that there's no polynomial bound, this shows that

  • The recurrence has no polynomial bound, but
  • Any exponential bound must be weak.

The recurrence we want to solve is

T(n) = T(n - 2) + T(n / 2) + 1

T(1) = T(2) = 1

One idea I had was that we might be able to evaluate this recurrence in a particular order. Suppose you want to evaluate T(n). Doing this for one step yields

T(n - 2) + T(n / 2) + 1

Now, suppose we expand out the T(n - 2) term. This gives

T(n - 4) + T(n / 2 - 1) + T(n / 2) + 2

Now, expand the n - 4 term. Doing this gives

T(n - 6) + T(n / 2 - 2) + T(n / 2 - 1) + T(n / 2) + 3

We can repeat the process of expanding out the subtractive term (the term of the form T(n - 2k)) over and over again. So what happens if we expand it out to the point where n - 2k is about n / 2? That's going to happen about n / 4 times. If we do this, we get that

T(n) = T(n / 2) + T(n / 2) + T(n / 2 - 1) + T(n / 2 - 2) + ... + T(n / 2 - n / 4) + n / 4

I'm going to make the assumption that T(n) is nondecreasing. If we do this, we get that

T(n) ≤ T(n / 2) + T(n / 2) + T(n / 2) + ... + T(n / 2) + n / 4

T(n) ≤ (n / 4 + 1) T(n / 2) + n / 4

I'm going to be a bit rough with the math here and make another simplification: instead of having the coefficient of T(n / 2) be (n / 4 + 1), I'm just going to have it be n / 4. I know this isn't safe, but I'm going to conjecture that it doesn't mess with the result too much. It would be a good idea to double-check that the rest of this still works after doing that, though!

This means that we can simplify our above recurrence (roughly) to get that

T(n) ≤ (n / 4) T(n / 2) + n / 4

This is something much easier to work with and we can now try to solve it directly. We'll use the iteration method and just keep plugging it into itself over and over until we spot a pattern. If we do this, we see the following pattern:

T(n) ≤ (n / 4) T(n / 2) + n / 4

≤ (n / 4)((n / 8) T(n / 4) + n / 8) + n / 4

= (n^2 / 32) T(n / 4) + n / 4 + n^2 / 32

≤ (n^2 / 32)((n / 16) T(n / 8) + n / 16) + n / 4 + n^2 / 32

= (n^3 / 512) T(n / 8) + n / 4 + n^2 / 32 + n^3 / 512

≤ (n^3 / 512)((n / 32) T(n / 16) + n / 32) + n / 4 + n^2 / 32 + n^3 / 512

= (n^4 / 16384) T(n / 16) + n / 4 + n^2 / 32 + n^3 / 512 + n^4 / 16384

Let's look at the pattern that's emerging. After doing this k times, the coefficient of T(n / 2^k) is n^k divided by some power of two. Those powers of two so far are 4, 32, 512, 16384. Those numbers don't appear to have any obvious pattern to them, but if we rewrite them as 2^2, 2^5, 2^9, 2^14 we see that the exponents follow the pattern 0, 2, 5, 9, 14, ... . This is one less than the triangular numbers 1, 3, 6, 10, 15, ... . This isn't a coincidence: if you think about it, the denominator keeps getting multiplied by the next power of two every time we expand out the recurrence, which increases the exponents on the powers of two from one triangular number to the next. Therefore, we'd expect the denominator after k iterations to be given by

2^((k + 1)(k + 2) / 2 - 1)

After doing that, the remaining terms are the sum of powers of n divided by powers of two raised to various triangular numbers. Therefore, if we expand things out k times, we get

T(n) ≤ (n^k / 2^((k + 1)(k + 2) / 2 - 1)) T(n / 2^k) + Σ_{i = 0}^{k} n^i / 2^((i + 1)(i + 2) / 2 - 1)

Okay! So that's a mess, but it's not all that bad. We know that the recursion stops as soon as n / 2^k = 1, which happens when k = lg n. To make things a bit simpler, I'm going to change the base case so that the recursion stops when n / 2^k = 2, which happens when k = lg n - 1. This doesn't change anything by more than a constant factor. So, if we substitute k = lg n - 1 into the above formula, we can simplify to get a closed-form expression for an upper bound. Doing this gives the following:

T(n) ≤ (n^k / 2^((k + 1)(k + 2) / 2 - 1)) T(n / 2^k) + Σ_{i = 0}^{k} n^i / 2^((i + 1)(i + 2) / 2 - 1)

= (n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)) T(2) + Σ_{i = 0}^{lg n - 1} n^i / 2^((i + 1)(i + 2) / 2 - 1)

Oof, that's not pretty. However, it's not as bad as it might seem. Let's look at the denominator of the first term:

2^((lg n)(lg n + 1) / 2 - 1)

Using basic laws of exponents (and being loose about a constant factor, which can't change an n^O(lg n) bound), we get that

2^((lg n)(lg n + 1) / 2 - 1)

≈ (2^(lg n))^((lg n + 1) / 2 - 1)

= n^((lg n + 1) / 2 - 1)

That's not so bad! So if we take the expression

n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)

We see that it simplifies as follows:

n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)

= n^(lg n - 1) / n^((lg n + 1) / 2 - 1)

= n^(lg n - 1 - ((lg n + 1) / 2 - 1))

= n^(lg n - (lg n + 1) / 2)

= n^((2 lg n - lg n - 1) / 2)

= n^((lg n - 1) / 2)

= n^O(lg n)

Nifty! That leading term ends up being of the form n^O(lg n)!

So what about the rest? Well, we have this awful summation to contend with:

Σ_{i = 0}^{lg n - 1} n^i / 2^((i + 1)(i + 2) / 2 - 1)

Fortunately, one thing we can note is that

n^i / 2^((i + 1)(i + 2) / 2 - 1)

≤ n^i / 2^i     (since (i + 1)(i + 2) / 2 - 1 ≥ i for all i ≥ 0)

= (n / 2)^i

So if all we want to do is upper-bound the summation, we can do so by using this much simpler sum:

Σ_{i = 0}^{lg n - 1} (n / 2)^i

Since n / 2 ≥ 1 for n ≥ 2, each term is at most (n / 2)^(lg n - 1), so this sum is in turn upper-bounded by

Σ_{i = 0}^{lg n - 1} (n / 2)^(lg n - 1)

= (lg n)(n / 2)^(lg n - 1)

= (lg n) n^(lg n - 1) / 2^(lg n - 1)

= (2 lg n) n^(lg n - 1) / n

= (2 lg n) n^(lg n - 2)

= n^O(lg n)

And bingo, the second term in that awful sum is also n^O(lg n)! This means that we have T(n) = n^O(log n). This bound isn't necessarily tight, but it does show that the recurrence is definitely subexponential, since n^O(log n) grows more slowly than any exponential function!

Hope this helps!

nhahtdh
templatetypedef
  • Thank you for your answer, and to answer your previous question: the telescopic expansion method is basically what you did in this answer, i.e. plugging the formula into itself until a pattern presents itself :) – Caesar23 Sep 18 '13 at 08:05

I think it's O(2^sqrt(n))

You have T(n) = T(n/2) + T(n-2).

Plugging in the proposed solution, dividing through by 2^sqrt(n), and letting n → ∞:

2^sqrt(n) = 2^sqrt(n/2)  +  2^sqrt(n-2)
---------   -----------     -----------
2^sqrt(n)    2^sqrt(n)       2^sqrt(n)

1 = 2^(sqrt(n/2) - sqrt(n)) + 2^(sqrt(n-2) - sqrt(n))

And 2^(sqrt(n/2) - sqrt(n)) → 0, while 2^(sqrt(n-2) - sqrt(n)) → 1.

Karol S
  • No, I don't think that's accurate. For example, for n=10 you'll get a call tree with 14 leaves, whereas your solution suggests that it shouldn't pass roughly 9. – Caesar23 Sep 17 '13 at 17:55
  • I never specified the constant. I think `T(n)/2^sqrt(n) → C`, but I don't know the C and I'd have to check it. – Karol S Sep 17 '13 at 18:04

T(n) = T(n/2) + T(n-2) + 1

We know that T(n-2) ≥ T(n/2) for n ≥ 4 (assuming T is nondecreasing),

so we can say that:

T(n) ≤ 2T(n-2) + 1

and T(n-2) ≤ 2T(n-4) + 1, so

T(n) ≤ 4T(n-4) + 3

T(n) ≤ 8T(n-6) + 7

T(n) ≤ (2^k)T(n-2k) + 2^k - 1

Since we know that T(1) = 1,

setting n-2k = 1 gives k = (n-1)/2, so

T(n) ≤ 2^((n-1)/2) + 2^((n-1)/2) - 1

T(n) = O(2^(n/2))

This is only an upper bound; there must be a better solution though.

Maher

There's always the empirical approach — time it...

/* "timer.h" and the Clock type are the author's own timing utilities,
   not a standard header. */
#include "timer.h"
#include <stdio.h>

static
int f2(int n)
{
    if (n < 2)
        return 1;
    return f2(n/2) + f2(n-2);
}

int main(void)
{

    printf("Fast:\n");
    for (int i = 1; i <= 512; i *= 2)
    {
        Clock clk;
        clk_init(&clk);
        clk_start(&clk);
        long sum = 0;
        for (int j = 0; j < 100; j++)
            sum += f2(i);
        clk_stop(&clk);
        char buffer[32];
        printf("%8d: elapsed %10s s; sum = %12ld\n",
               i, clk_elapsed_us(&clk, buffer, sizeof(buffer)), sum/100);
    }

    printf("\nSlow:\n");
    for (int i = 512; i < 1024; i += 16)
    {
        Clock clk;
        clk_init(&clk);
        clk_start(&clk);
        long sum = 0;
        for (int j = 0; j < 100; j++)
            sum += f2(i);
        clk_stop(&clk);
        char buffer[32];
        printf("%8d: elapsed %7s s; sum = %12ld\n",
               i, clk_elapsed_ms(&clk, buffer, sizeof(buffer)), sum/100);
    }

    return 0;
}

Results:

For the record, the test was run on a MacBook Pro running Mac OS X 10.8.5, Intel Core i7 at 2.3 GHz, with 16 GiB RAM.

Fast:
       1: elapsed   0.000000 s; sum =            1
       2: elapsed   0.000001 s; sum =            2
       4: elapsed   0.000001 s; sum =            4
       8: elapsed   0.000002 s; sum =           10
      16: elapsed   0.000008 s; sum =           36
      32: elapsed   0.000045 s; sum =          202
      64: elapsed   0.000408 s; sum =         1828
     128: elapsed   0.005985 s; sum =        27338
     256: elapsed   0.139971 s; sum =       692004
     512: elapsed   5.273834 s; sum =     30251722

Slow:
     512: elapsed   5.286 s; sum =     30251722
     528: elapsed   6.370 s; sum =     36243644
     544: elapsed   7.609 s; sum =     43234278
     560: elapsed   8.949 s; sum =     51361196
     576: elapsed  10.397 s; sum =     60777212
     592: elapsed  12.171 s; sum =     71651844
     608: elapsed  14.394 s; sum =     84172786
     624: elapsed  16.716 s; sum =     98547380
     640: elapsed  19.176 s; sum =    115004102
     656: elapsed  21.985 s; sum =    133794468
     672: elapsed  25.295 s; sum =    155194954
     688: elapsed  29.170 s; sum =    179508916
     704: elapsed  33.456 s; sum =    207068524
     720: elapsed  38.317 s; sum =    238237116
     736: elapsed  43.776 s; sum =    273411566
     752: elapsed  49.878 s; sum =    313024652
     768: elapsed  56.979 s; sum =    357547444
     784: elapsed  64.456 s; sum =    407492292
     800: elapsed  72.980 s; sum =    463415834
     816: elapsed  82.535 s; sum =    525922004
     832: elapsed  93.062 s; sum =    595665060
     848: elapsed 104.886 s; sum =    673353212
     864: elapsed 118.038 s; sum =    759752270
     880: elapsed 132.569 s; sum =    855689292
     896: elapsed 150.487 s; sum =    962056258
     912: elapsed 166.065 s; sum =   1079814524
     928: elapsed 185.616 s; sum =   1209999302
     944: elapsed 206.875 s; sum =   1353724140
     960: elapsed 230.670 s; sum =   1512185428
     976: elapsed 256.718 s; sum =   1686667684
     992: elapsed 285.158 s; sum =   1878548866
    1008: elapsed 316.281 s; sum =   2089305684

I've not gone curve-fitting that data. Clearly, you could reduce or eliminate the 100-iteration inner loop at the larger sizes, and so on and so forth (and that would avoid the overflows that occur with 1024, and ...).

If someone's got some expertise in drawing graphs, it might show the result (logarithmic scale for the time).

Jonathan Leffler

I think it is exponential. Why? Every call to f(n) leads to two further calls, f(n/2) and f(n-2).

Check this question for details:

Computational complexity of Fibonacci Sequence

That recursion has Fibonacci-order (exponential) complexity, and your function's structure is similar.

Manoj Awasthi
  • 3,460
  • 2
  • 22
  • 26
  • 5
    There's a huge difference between two subcalls to sizes n-1 and n-2 and two subcalls to sizes n-2 and n/2. Recurring on sizes n/2 can lead to polynomial runtimes; see the master theorem for some examples. I think you need to provide more evidence for your claim. – templatetypedef Sep 17 '13 at 17:47