It looks like the lower bound is pretty good, so I tried to prove that the upper bound is O(log n / log log n).
But first let me explain the other bounds (just for better understanding).
TL;DR
T(n) is in Θ(log n / log log n).
T(n) is in O(log n)
This can be seen by replacing the update n := n/log₂(n) with n := n/2. Since log₂(n) ≥ 2 for n ≥ 4, the modified loop shrinks n more slowly and therefore needs at least as many steps; it needs O(log₂ n) steps until n ≤ 2 holds.
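To make this concrete, here is a minimal sketch of that modified loop (the name T_upper and the small main are mine, not part of the original code):

#include <stdio.h>

// Modified update: divide by 2 instead of by log2(n). Since log2(n) >= 2
// for n >= 4, this shrinks n more slowly, so its iteration count
// upper-bounds the one of the original function.
int T_upper(int n) {
    int count = 0;
    while (n > 2) {
        n = n / 2;
        ++count;
    }
    return count;  // O(log2(n)) iterations until n <= 2
}

int main(void) {
    printf("%d\n", T_upper(1 << 20));  // n = 2^20
    return 0;
}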
T(n) is in Ω(log n / log log n)
This can be seen by replacing the update n := n/log₂(n) with n := n/m, where m is the initial value of log₂(n). Since m is at least as large as every later value of log₂(n), the modified loop shrinks n at least as fast and therefore needs at most as many steps.
Solving the inequality
n / (log n)^x < 2
for x (by taking logarithms on both sides) leads us to
log n - x⋅log log n < log 2
⇔ log n - log 2 < x⋅log log n
⇔ (log n - log 2) / log log n < x
⇒ x ∈ Ω(log n / log log n)
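As an illustration of this modified loop (again, the name T_lower and the small main are my own):

#include <stdio.h>
#include <math.h>

// Modified update: always divide by the *initial* value m = log2(n).
// Since m is never smaller than the current log2(n), n shrinks at least
// as fast as in the original, so this iteration count lower-bounds T(n).
// Assumes n >= 4 so that m >= 2.
int T_lower(int n) {
    int m = (int)log2(n);  // fixed divisor, computed once
    int count = 0;
    while (n > 2) {
        n = n / m;
        ++count;
    }
    return count;  // roughly log(n) / log(log(n)) iterations
}

int main(void) {
    printf("%d\n", T_lower(1 << 20));  // n = 2^20, so m = 20
    return 0;
}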
Improving the upper bound: O(log n) → O(log n / log log n)
Now let us try to improve the upper bound. Instead of dividing n by a fixed constant (namely 2 in the proof above), we divide n by half of the initial value of log(n) for as long as the current value of log(n) is bigger than that. To make this clearer, have a look at the modified code:
#include <math.h>  // for log2

int T2(int n) {
    int n_old = n;
    int count = 0;
    for (; n > 2; ++count)
    {
        // always divide by half of the logarithm of the reference value n_old
        n = (int)(n / (log2(n_old) / 2));
        // once log2(n) has dropped to half of log2(n_old), i.e. n <= sqrt(n_old),
        // take the current n as the new reference value
        if (log2(n) <= log2(n_old) / 2)
        {
            n_old = n;
        }
    }
    return count;
}
The complexity of the function T₂ is clearly an upper bound for the complexity of the function T, since log₂(n_old)/2 < log₂(n) holds the whole time, i.e. T₂ always divides by a smaller quantity than T does and therefore needs at least as many iterations.
Now we need to know how many times we divide by each fixed value 1/2⋅log(n_old) before n_old is updated; note that 1/2⋅log(n_old) = log(sqrt(n_old)), and that n_old is replaced as soon as n has dropped to sqrt(n_old). Writing n for n_old, we solve:
n / (log(sqrt(n)))^x ≤ sqrt(n)
⇔ n / sqrt(n) ≤ (log(sqrt(n)))^x
⇔ log(sqrt(n)) ≤ x⋅log(log(sqrt(n)))
⇔ log(sqrt(n)) / log(log(sqrt(n))) ≤ x
So we get the recurrence formula T₂(n) = T₂(sqrt(n)) + O(log(sqrt(n)) / log(log(sqrt(n)))).
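For instance (my own numbers, just to make the recurrence concrete): for n = 2^16 one expansion gives T₂(2^16) = T₂(2^8) + O(8 / log 8) = T₂(2^8) + O(8/3).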
Now we need to know how often this formula has to be expanded until n < 2 holds.
n^(2^(-x)) < 2
⇔ 2^(-x)⋅log n < log 2
⇔ -x⋅log 2 + log log n < log log 2
⇔ log log n < log log 2 + x⋅log 2
So we need to expand the formula about log log n times.
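For example (again my own numbers): for n = 2^16 we have log log n = 4, and indeed taking square roots four times gives 2^16 → 2^8 → 2^4 → 2^2 → 2, so after about log log n expansions the argument has dropped to 2.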
Now it gets a little bit harder. (Also have a look at Mike_Dog's answer.)
T₂(n) = T₂(sqrt(n)) + log(sqrt(n)) / log(log(sqrt(n)))
= Σ_{k=1,...,log log n - 1} 2^(-k)⋅log(n) / log(2^(-k)⋅log n)
= log(n) ⋅ Σ_{k=1,...,log log n - 1} 2^(-k) / (-k + log log n)
(1) = log(n) ⋅ Σ_{k=1,...,log log n - 1} 2^(k - log log n) / k
= log(n) ⋅ Σ_{k=1,...,log log n - 1} 2^k ⋅ 2^(-log log n) / k
= log(n) ⋅ Σ_{k=1,...,log log n - 1} 2^k / (k ⋅ log n)
= Σ_{k=1,...,log log n - 1} 2^k / k
In the line marked with (1) I reordered the sum by substituting k → log log n - k.
So, at the end we "only" have to calculate Σ_{k=1,...,t} 2^k / k for t = log log n - 1. At this point Maple solves this to
Σ_{k=1,...,t} 2^k / k = -I⋅π - 2^t⋅LerchPhi(2, 1, t) + 2^t/t
where I is the imaginary unit and LerchPhi is the Lerch transcendent. Since the sum above is a real number for all relevant cases, we can simply ignore all imaginary parts. The Lerch transcendent LerchPhi(2, 1, t) seems to be in O(-1/t), but I'm not 100% sure about that. Maybe someone will prove this.
Finally, since 2^t = 2^(log log n - 1) = log(n)/2 and t ∈ Θ(log log n), this results in
T₂(n) = -2^t⋅O(-1/t) + 2^t/t = O(2^t/t) = O(log n / log log n)
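Since I could not fully prove the LerchPhi estimate, here is a small numerical check of my own (not a proof) that the partial sums Σ_{k=1,...,t} 2^k / k really stay within a constant factor of 2^t/t:

#include <stdio.h>

int main(void) {
    // If the sum is Theta(2^t / t), the ratio below should approach a
    // constant (it seems to settle near 2).
    for (int t = 5; t <= 40; t += 5) {
        double sum = 0.0;
        double pow2 = 2.0;               // 2^k, starting at k = 1
        for (int k = 1; k <= t; ++k) {
            sum += pow2 / k;
            pow2 *= 2.0;
        }
        double ref = pow2 / 2.0 / t;     // 2^t / t  (pow2 is now 2^(t+1))
        printf("t = %2d   sum = %.4e   2^t/t = %.4e   ratio = %.3f\n",
               t, sum, ref, sum / ref);
    }
    return 0;
}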
Altogether we have T(n) ∈ Ω(log n / log log n) and T(n) ∈ O(log n / log log n), so T(n) ∈ Θ(log n / log log n) holds. This result is also supported by your example data.
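For completeness, this is the kind of quick experiment that produces such data (the driver is my own, and T below is my reading of the function from the question, i.e. n := n/log₂(n) until n ≤ 2):

#include <stdio.h>
#include <math.h>

// My reading of the function from the question.
int T(long long n) {
    int count = 0;
    while (n > 2) {
        n = (long long)(n / log2((double)n));
        ++count;
    }
    return count;
}

int main(void) {
    // The ratio should stay roughly constant if T(n) is Theta(log n / log log n).
    for (int e = 8; e <= 56; e += 8) {
        long long n = 1LL << e;
        int steps = T(n);
        double bound = e / log2((double)e);  // log2(n) / log2(log2(n)) for n = 2^e
        printf("n = 2^%-2d  T(n) = %3d  log n / log log n = %6.2f  ratio = %.2f\n",
               e, steps, bound, steps / bound);
    }
    return 0;
}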
I hope this is understandable and it helps a little.