I just started learning about asymptotic bounds for algorithms.
Question:
What can we say about Θ for an algorithm if we find different lower and upper bounds (say Ω(n) and O(n^2))? Or rather, what can we say about the tightness of such bounds?
The book I am reading says Θ applies when the upper and lower bounds of the function are the same.
What about in this case?

- Assuming those lower and upper bounds are 'tight'? – supertoniefy Mar 17 '13 at 14:42
2 Answers
I don't think you can say anything, in that case.
The definition of Θ(f(n)) is:
A function is Θ(f(n)) if and only if it is Ω(f(n)) and O(f(n)).
For some pathological function that oscillates between those behaviors, such as alternating between n and n^2, no such tight bound exists.
Example:
f(n) = n   if n is odd
       n^2 if n is even
Your bounds Ω(n) and O(n^2) are tight on this function, but f(n) is neither Θ(n) nor Θ(n^2); there is no simple g(n) with f(n) = Θ(g(n)).
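A quick sketch (the function name and table are mine, not from the answer) that tabulates the oscillating function above: neither f(n)/n nor f(n)/n^2 settles on a constant, which is why no single bound of the form Θ(n^k) fits.

```python
def f(n):
    """n if n is odd, n^2 if n is even."""
    return n if n % 2 == 1 else n * n

# The ratio to n blows up on even inputs, while the ratio to n^2
# shrinks toward zero on odd inputs -- so neither bound is ever matched
# from both sides at once.
for n in range(1, 9):
    print(n, f(n), f(n) / n, f(n) / n ** 2)
```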

- So in practice, an average-case running time for such an algorithm would be preferred? – supertoniefy Mar 17 '13 at 14:56
- There's a distinction between bounding a general function and bounding the running time of an algorithm. Most algorithms don't exhibit this type of behavior; as far as I can tell, all expected or worst-case running times have big-Theta bounds. – Andrew Mao Mar 18 '13 at 01:59
Just for a bit of practicality: one algorithm whose running time is not in Θ(f(n)) for any simple f(n) would be insertion sort. It runs in Ω(n), since for a list that is already sorted each insert takes only one operation, but it runs in O(n^2) in the general case. Functions that oscillate or are otherwise non-continuous are usually constructed for didactic purposes; in my experience such functions rarely, if ever, appear with actual algorithms.
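The two extremes for insertion sort can be made concrete by counting element shifts (a sketch of mine, not code from the answer): a sorted input needs no shifts at all, while a reversed input needs the full quadratic amount.

```python
def insertion_sort(a):
    """Sort the list a in place; return the number of element shifts performed."""
    shifts = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot to the right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            shifts += 1
        a[j + 1] = key
    return shifts

print(insertion_sort(list(range(100))))         # sorted input: 0 shifts (linear case)
print(insertion_sort(list(range(100, 0, -1))))  # reversed input: 4950 shifts (~ n^2/2)
```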
Regarding tightness, I have only ever heard the term in this context with reference to the upper and lower bounds proposed for algorithms. Again taking insertion sort as the example, the given bounds are tight in the sense that there are instances of the problem that can actually be solved in time linear in their size (the lower bound), and other instances that will not execute in less than quadratic time (the upper bound). Bounds that are valid, but not tight, for insertion sort would be Ω(1), since you can't sort lists of arbitrary size in constant time, and O(n^3), because you can always sort a list of n elements in O(n^2) time, an asymptotic order less, so you can certainly do it in O(n^3).
What bounds are for is to give us a rough idea of the performance we can expect from our algorithms, so we know how efficient our solutions are. Tight bounds are the most desirable: they give us that rough idea, and the idea is optimal in the sense that there are extreme cases (which sometimes are also the general case) where we actually need all the running time the bound allows.
The average-case complexity is not a bound; it "only" describes how efficient an algorithm is "in most cases". Take quicksort, which has an average-case complexity of O(n log n) but a worst-case complexity of O(n^2) (for a standard implementation, the best case is also Θ(n log n), not Ω(n)). This tells us that for almost all inputs quicksort is as fast as comparison-based sorting gets in general (the average case), while there are instances of the problem, such as already-sorted input with a naive pivot choice, that take quicksort much longer than that to solve (the worst case, which gives the upper bound).
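The gap between the average and the worst case is easy to see by counting comparisons in a toy quicksort (a sketch of mine, not code from any answer) that always picks the first element as pivot, for which already-sorted input is the worst case:

```python
import random

def qsort(a):
    """Return (sorted copy of a, comparison count), pivoting on the first element.

    Counts one comparison per element partitioned at each level.
    """
    if len(a) <= 1:
        return list(a), 0
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    ls, lc = qsort(left)
    rs, rc = qsort(right)
    return ls + [pivot] + rs, lc + rc + len(rest)

random.seed(0)
n = 300
_, c_sorted = qsort(list(range(n)))             # worst case for this pivot choice
_, c_random = qsort(random.sample(range(n), n)) # typical (average) case
print(c_sorted)  # 44850 = n(n-1)/2: quadratic
print(c_random)  # around 2 n ln n: far smaller
```

With a first-element pivot the sorted input degenerates into n levels of one-sided partitions, hence the n(n-1)/2 comparisons, while a shuffled input splits roughly in half and stays near n log n.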
