The recursion relation gives rise to a sub-exponential but super-linear running time, which means that any exponential with a base larger than one works as an upper bound for large enough n.
Your choice of 2^n is a good answer and possibly the one they were looking for in the book. It is a simple bound that is valid even for quite small values of n. (Still, I understand why you are asking the question, because 2^n grows much faster than T(n) even for moderately large n.)
Given T(1) = 1 (or some other constant), the recursion equation gives us the following running times for the first few values of n:
T(1) = 1      2^1 = 2
T(2) = 4      2^2 = 4
T(3) = 11     2^3 = 8
T(4) = 19     2^4 = 16
T(5) = 35     2^5 = 32
T(6) = 52     2^6 = 64
T(7) = 78     2^7 = 128
T(8) = 105    2^8 = 256
T(9) = 149    2^9 = 512
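The table can be reproduced with a short memoized computation. This is a sketch that assumes T(n/2) is rounded up to T(ceil(n/2)), which is the interpretation that matches the values above:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = T(n-1) + T(ceil(n/2)) + n, with base case T(1) = 1."""
    if n <= 1:
        return 1
    return T(n - 1) + T((n + 1) // 2) + n  # (n + 1) // 2 == ceil(n / 2)

# Print T(n) next to 2^n for the first few values of n.
for n in range(1, 10):
    print(f"T({n}) = {T(n):4d}    2^{n} = {2 ** n}")
```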
We can see that the choice of 2^n as an upper bound is valid for all n >= 6.
If you want a tighter bound than 2^n you could choose a smaller base (with the trade-off that it will only be valid for larger values of n). But I must add that it will still be essentially the same solution as the one you already have.
Any base larger than one would do, but to be a bit more specific we can for example note that the recursion equation T(n) = T(n-1) + T(n/2) + n is bounded by the equation T(n) = T(n-1) + T(n-2) for n > 5.
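This domination can be spot-checked numerically: one step of the original recursion stays below one step of the Fibonacci-style recursion as long as T(ceil(n/2)) + n <= T(n-2). A quick sketch (again using the ceiling interpretation that matches the table above) confirms this from n = 6 onward:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n-1) + T(ceil(n/2)) + n, with T(1) = 1
    if n <= 1:
        return 1
    return T(n - 1) + T((n + 1) // 2) + n

# T(n) = T(n-1) + T(ceil(n/2)) + n <= T(n-1) + T(n-2)
# holds whenever T(ceil(n/2)) + n <= T(n-2).
for n in range(6, 50):
    assert T((n + 1) // 2) + n <= T(n - 2), n
print("step inequality holds for n = 6..49")
```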
This is the same recursion relation as for the Fibonacci sequence, and following the steps in the answers to this question it has a computational complexity matching the golden ratio (1+sqrt(5))/2 ≈ 1.618 to the power of n.
Plotting the actual values we can see for which n the value of T(n) is bounded by ((1+sqrt(5))/2)^n. From the figure it seems to hold for n = 13 and above.
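Without the figure, the crossover can also be located numerically. This is a sketch; the exact crossover point depends on the base case and on how T(n/2) is rounded:

```python
from functools import lru_cache

PHI = (1 + 5 ** 0.5) / 2  # golden ratio, about 1.618

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n-1) + T(ceil(n/2)) + n, with T(1) = 1
    if n <= 1:
        return 1
    return T(n - 1) + T((n + 1) // 2) + n

# Show where T(n) drops below PHI^n.
for n in range(1, 20):
    print(f"n = {n:2d}   T(n) = {T(n):5d}   PHI^n = {PHI ** n:8.1f}   "
          f"bounded: {T(n) <= PHI ** n}")
```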

All this said, I have thought a bit about approximating the running time with a sub-exponential closed form. It does not seem to be easily done, and as I said, I believe you have found the expected answer.