TL;DR: I think the recurrence is n^(O(log n)), which is super-polynomial but subexponential. I don't have a matching lower bound and I doubt this is tight, but it's progress!
I think I can get an upper bound that's subexponential but super-polynomial. Combined with @Adam Rosenfield's answer, which shows that there's no polynomial bound, this tells us that
- The recurrence has no polynomial bound, but
- Any exponential upper bound, while valid, must badly overestimate the true rate of growth.
The recurrence we want to solve is
T(n) = T(n - 2) + T(n / 2) + 1
T(1) = T(2) = 1
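To have something concrete to poke at, here's a small Python sketch that just evaluates the recurrence directly. One assumption on my part: the recurrence doesn't say what n / 2 means for odd n, so I'm taking it to be floor division.

```python
def T(n):
    """T(n) = T(n - 2) + T(n // 2) + 1, with T(1) = T(2) = 1.

    Built bottom-up (no recursion) so large n is no problem.
    n / 2 is interpreted as floor division, which is an assumption.
    """
    vals = [None, 1, 1]                       # vals[1] = vals[2] = 1
    for m in range(3, n + 1):
        vals.append(vals[m - 2] + vals[m // 2] + 1)
    return vals[n]

print([T(n) for n in range(1, 11)])           # [1, 1, 3, 3, 5, 7, 9, 11, 13, 17]
```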
One idea I had was that we might be able to expand out this recurrence in a particular order. Suppose you want to evaluate T(n). Doing this for one step yields
T(n - 2) + T(n / 2) + 1
Now, suppose we expand out the T(n - 2) term. This gives
T(n - 4) + T(n / 2 - 1) + T(n / 2) + 2
Now, expand the T(n - 4) term. Doing this gives
T(n - 6) + T(n / 2 - 2) + T(n / 2 - 1) + T(n / 2) + 3
We can repeat the process of expanding out the subtractive term (the term of the form T(n - 2k)) over and over again. So what happens if we expand it out to the point where n - 2k is about n / 2? That's going to happen about n / 4 times. If we do this, we get that
T(n) = T(n / 2) + T(n / 2) + T(n / 2 - 1) + T(n / 2 - 2) + ... + T(n / 2 - n / 4) + n / 4
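As a sanity check on that expansion (using the floor-division sketch of T from above), after k expansions we should have exactly T(n) = T(n - 2k) + T(n / 2) + T(n / 2 - 1) + ... + T(n / 2 - (k - 1)) + k, and that's easy to confirm numerically:

```python
def check_expansion(n, k):
    """Does T(n) equal T(n - 2k) + sum of T(n//2 - j) for j = 0..k-1, plus k?

    Only meaningful while every argument stays >= 1, so k can't be too big.
    Reuses T() from the sketch above.
    """
    expanded = T(n - 2 * k) + sum(T(n // 2 - j) for j in range(k)) + k
    return T(n) == expanded

# Expand all the way down to roughly n / 2, i.e. k = n // 4.
print(all(check_expansion(n, n // 4) for n in range(8, 300)))   # True
```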
I'm going to make the assumption that T(n) is nondecreasing. If we do this, we get that
T(n) ≤ T(n / 2) + T(n / 2) + T(n / 2) + ... + T(n / 2) + n / 4
T(n) ≤ (n / 4 + 1) T(n / 2) + n / 4
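Both the nondecreasing assumption and this rough bound are easy to spot-check numerically (still assuming floor division). One caveat: because the derivation above ignores floors and off-by-one issues, the bound can fail for a handful of very small n, so the script below just reports where it fails rather than asserting it.

```python
N = 2000
table = [None, 1, 1]
for m in range(3, N + 1):
    table.append(table[m - 2] + table[m // 2] + 1)

# Is T nondecreasing on [1, N]?
print(all(table[m] <= table[m + 1] for m in range(1, N)))          # True

# Where does T(n) <= (n/4 + 1) T(n/2) + n/4 fail?
bad = [n for n in range(3, N + 1)
       if table[n] > (n / 4 + 1) * table[n // 2] + n / 4]
print(bad)   # [3, 5, 9] for me — only tiny n, where the floors bite
```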
I'm going to be a bit rough with the math here and make another simplification: instead of having the coefficient of T(n / 2) be (n / 4 + 1), I'm just going to have it be n / 4. I know this isn't safe, but I'm going to conjecture that it doesn't mess with the result too much. It would be a good idea to double-check that the rest of this still works after doing that, though!
This means that we can simplify our above recurrence (roughly) to get that
T(n) ≤ (n / 4) T(n / 2) + n / 4
This is something much easier to work with and we can now try to solve it directly. We'll use the iteration method and just keep plugging it into itself over and over until we spot a pattern. If we do this, we see the following pattern:
T(n) ≤ (n / 4) T(n / 2) + n / 4
≤ (n / 4)((n / 8) T(n / 4) + n / 8) + n / 4
= (n^2 / 32) T(n / 4) + n / 4 + n^2 / 32
≤ (n^2 / 32)((n / 16) T(n / 8) + n / 16) + n / 4 + n^2 / 32
= (n^3 / 512) T(n / 8) + n / 4 + n^2 / 32 + n^3 / 512
≤ (n^3 / 512)((n / 32) T(n / 16) + n / 32) + n / 4 + n^2 / 32 + n^3 / 512
= (n^4 / 16384) T(n / 16) + n / 4 + n^2 / 32 + n^3 / 512 + n^4 / 16384
Let's look at the pattern that's emerging. After doing this k times, the coefficient of T(n / 2^k) is n^k divided by some power of two. Those powers of two so far are 4, 32, 512, 16384. They don't appear to have any obvious pattern to them, but if we rewrite them as 2^2, 2^5, 2^9, 2^14, we see that the exponents follow the pattern 0, 2, 5, 9, 14, ... (the leading 0 covers the k = 0 case, where the coefficient is just 1 = n^0 / 2^0). This is one less than the triangular numbers 1, 3, 6, 10, 15, ... . That isn't a coincidence: every time we expand out the recurrence, the denominator gets multiplied by the next power of two, which bumps its exponent from one triangular-number-minus-one to the next. Therefore, we'd expect the denominator after k iterations to be given by
2^((k + 1)(k + 2) / 2 - 1)
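That guess is easy to verify mechanically: each expansion multiplies the denominator by the next power of two (2^2, then 2^3, then 2^4, ...), so the denominator after k expansions should be their product. A quick check that this matches the formula (and reproduces 4, 32, 512, 16384):

```python
def denominator(k):
    """Denominator of the coefficient of T(n / 2^k): the product of the
    factors 2^(j + 1) picked up at expansion steps j = 1..k."""
    d = 1
    for j in range(1, k + 1):
        d *= 2 ** (j + 1)
    return d

print([denominator(k) for k in range(1, 5)])       # [4, 32, 512, 16384]
print(all(denominator(k) == 2 ** ((k + 1) * (k + 2) // 2 - 1)
          for k in range(1, 30)))                  # True: matches the formula
```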
The remaining additive terms, meanwhile, are powers of n divided by two raised to one less than the corresponding triangular numbers. Therefore, if we expand things out k times, we get
T(n) ≤ (n^k / 2^((k + 1)(k + 2) / 2 - 1)) T(n / 2^k) + Σ_{i = 0}^{k} n^i / 2^((i + 1)(i + 2) / 2 - 1)
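It's worth double-checking that bookkeeping. Treating the simplified recurrence as an equality S(n) = (n / 4) S(n / 2) + n / 4 with S(2) = 1 (just to check the algebra, not the original T), the unrolled closed form should match it exactly for n a power of two. The only wrinkle is that the exact unrolling gives the sum starting at i = 1; starting it at i = 0 as above merely adds an extra 1, which keeps it an upper bound. A sketch of the check:

```python
from fractions import Fraction

def S(n):
    """Exact value of S(n) = (n/4) S(n/2) + n/4, S(2) = 1, for n a power of two."""
    return Fraction(1) if n == 2 else Fraction(n, 4) * S(n // 2) + Fraction(n, 4)

def closed_form(n, k):
    """(n^k / 2^((k+1)(k+2)/2 - 1)) S(n / 2^k) + sum_{i=1}^{k} n^i / 2^((i+1)(i+2)/2 - 1)."""
    lead = Fraction(n ** k, 2 ** ((k + 1) * (k + 2) // 2 - 1)) * S(n // 2 ** k)
    tail = sum(Fraction(n ** i, 2 ** ((i + 1) * (i + 2) // 2 - 1)) for i in range(1, k + 1))
    return lead + tail

# Unrolling all the way down (k = lg n - 1) should reproduce S(n) exactly.
print(all(closed_form(2 ** m, m - 1) == S(2 ** m) for m in range(2, 12)))   # True
```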
Okay! So that's a mess, but it's not all that bad. We know that the recursion stops as soon as n / 2^k = 1, which happens when k = lg n. To make things a bit simpler, I'm going to change the base case so that the recursion stops when n / 2^k = 2, which happens when k = lg n - 1. This doesn't change anything by more than a constant factor. So, if we substitute k = lg n - 1 into the above formula, we can simplify to get a closed-form expression for an upper bound. Doing this gives the following:
T(n) ≤ (n^k / 2^((k + 1)(k + 2) / 2 - 1)) T(n / 2^k) + Σ_{i = 0}^{k} n^i / 2^((i + 1)(i + 2) / 2 - 1)
= (n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)) T(2) + Σ_{i = 0}^{lg n - 1} n^i / 2^((i + 1)(i + 2) / 2 - 1)
Oof, that's not pretty. However, it's not as bad as it might seem. Let's look at the denominator of the first term:
2^((lg n)(lg n + 1) / 2 - 1)
Using basic laws of exponents, together with the fact that lg n ≥ 1 for n ≥ 2 (so replacing the "- 1" in the exponent with "- lg n" can only shrink the denominator), we get that
2^((lg n)(lg n + 1) / 2 - 1)
≥ 2^((lg n)(lg n + 1) / 2 - lg n)
= (2^(lg n))^((lg n + 1) / 2 - 1)
= n^((lg n + 1) / 2 - 1)
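If you'd rather not take that inequality on faith, it's cheap to spot-check numerically (it only needs lg n ≥ 1, i.e. n ≥ 2):

```python
from math import log2

# 2^((lg n)(lg n + 1)/2 - 1) should be at least n^((lg n + 1)/2 - 1) for n >= 2.
print(all(2 ** (log2(n) * (log2(n) + 1) / 2 - 1)
          >= n ** ((log2(n) + 1) / 2 - 1) - 1e-9     # tiny slack for float round-off
          for n in range(2, 2000)))                  # True
```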
That's not so bad! So if we take the expression
n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)
we see that it can be bounded as follows:
n^(lg n - 1) / 2^((lg n)(lg n + 1) / 2 - 1)
≤ n^(lg n - 1) / n^((lg n + 1) / 2 - 1)
= n^(lg n - 1 - ((lg n + 1) / 2 - 1))
= n^(lg n - (lg n + 1) / 2)
= n^((2 lg n - lg n - 1) / 2)
= n^((lg n - 1) / 2)
= n^(O(lg n))
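The exponent arithmetic in that chain is just algebra in lg n, and it's trivial to sanity-check numerically if you don't trust it:

```python
# (lg n - 1) - ((lg n + 1)/2 - 1) should simplify to (lg n - 1)/2 for any lg n.
for L in [1.0, 2.0, 3.5, 10.0, 20.0]:       # L plays the role of lg n
    assert abs(((L - 1) - ((L + 1) / 2 - 1)) - (L - 1) / 2) < 1e-12
print("exponent algebra checks out")
```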
Nifty! That leading term ends up being of the form n^(O(lg n))!
So what about the rest? Well, we have this awful summation to contend with:
Σ_{i = 0}^{lg n - 1} n^i / 2^((i + 1)(i + 2) / 2 - 1)
Fortunately, one thing we can note is that
n^i / 2^((i + 1)(i + 2) / 2 - 1)
= n^i / 2^(i(i + 3) / 2)
≤ n^i / 2^(i(i + 1) / 2)
≤ n^i / 2^i
= (n / 2)^i
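(The key fact in that chain is just that the exponent (i + 1)(i + 2) / 2 - 1 = i(i + 3) / 2 is at least i for every i ≥ 0.) If you want an exact spot check of the per-term bound, something like this works:

```python
from fractions import Fraction

# Each term n^i / 2^((i+1)(i+2)/2 - 1) should be at most (n/2)^i.
print(all(Fraction(n ** i, 2 ** ((i + 1) * (i + 2) // 2 - 1)) <= Fraction(n, 2) ** i
          for n in (2, 3, 10, 100) for i in range(20)))     # True
```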
So if all we want to do is upper-bound the summation, we can do so by using this much simpler sum:
Σ_{i = 0}^{lg n - 1} (n / 2)^i
Since n / 2 ≥ 1, each of the lg n terms in this sum is at most the largest one, (n / 2)^(lg n - 1), so the sum is in turn upper bounded by
Σ_{i = 0}^{lg n - 1} (n / 2)^(lg n - 1)
= (lg n)(n^(lg n - 1)) / 2^(lg n - 1)
= (2 lg n)(n^(lg n - 1)) / n
= (2 lg n) n^(lg n - 2)
= n^(O(lg n))
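Same idea for the whole summation bound; for n a power of two (so lg n is an integer) we can check it exactly:

```python
from fractions import Fraction

def geometric_sum(n):
    """Sum_{i = 0}^{lg n - 1} (n / 2)^i, exactly, for n a power of two."""
    m = n.bit_length() - 1                    # lg n
    return sum(Fraction(n, 2) ** i for i in range(m))

def claimed_bound(n):
    """(2 lg n) * n^(lg n - 2)."""
    m = n.bit_length() - 1
    return 2 * m * Fraction(n) ** (m - 2)

print(all(geometric_sum(2 ** m) <= claimed_bound(2 ** m) for m in range(1, 15)))   # True
```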
And bingo, the second term in that awful sum is also n^(O(lg n))! This means that we have that T(n) = n^(O(log n)). This bound isn't necessarily tight, but it does show that the recurrence is definitely subexponential, since n^(O(log n)) grows more slowly than any exponential function!
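As a purely empirical illustration (not a proof of anything): if T(n) really is n^(O(log n)), then log2 T(n) should be at most a constant times (log2 n)^2. Printing that ratio for the floor-division version of T, it stays bounded by a small constant over the range below, which is at least consistent with the bound:

```python
from math import log2

N = 1 << 16
table = [None, 1, 1]
for m in range(3, N + 1):
    table.append(table[m - 2] + table[m // 2] + 1)

# How does log2 T(n) compare to (log2 n)^2 as n grows?
for e in range(4, 17):
    n = 1 << e
    print(n, round(log2(table[n]) / log2(n) ** 2, 3))
```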
Hope this helps!