215

I'm really confused about the differences between big O, big Omega, and big Theta notation.

I understand that big O is the upper bound and big Omega is the lower bound, but what exactly does big Ө (theta) represent?

I have read that it means tight bound, but what does that mean?

nbro
user1364768
  • Possible duplicate of [Difference between lower bound and tight bound?](https://stackoverflow.com/questions/464078/difference-between-lower-bound-and-tight-bound) – Damjan Pavlica May 13 '18 at 21:40

7 Answers

377

First, let's understand what big O, big Theta and big Omega are. They are all sets of functions.

Big O gives an asymptotic upper bound, while big Omega gives a lower bound. Big Theta gives both.

Everything that is Ө(f(n)) is also O(f(n)), but not the other way around.
T(n) is said to be in Ө(f(n)) if it is both in O(f(n)) and in Omega(f(n)).
In set terminology, Ө(f(n)) is the intersection of O(f(n)) and Omega(f(n)).

For example, merge sort's worst case is both O(n*log(n)) and Omega(n*log(n)) - and thus is also Ө(n*log(n)) - but it is also O(n^2), since n^2 is asymptotically "bigger" than n*log(n). However, it is not Ө(n^2), since the algorithm is not Omega(n^2).
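To make the merge sort example concrete, here is a minimal sketch (the function name and the experiment are mine, not part of the original claim) that counts comparisons and divides by n*log(n); the ratio settling between two positive constants is exactly what being Ө(n*log(n)) asserts:

```python
import math
import random

def merge_sort_comparisons(arr):
    """Merge sort that also returns how many comparisons it performed."""
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, c_left = merge_sort_comparisons(arr[:mid])
    right, c_right = merge_sort_comparisons(arr[mid:])
    merged, i, j, comps = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, c_left + c_right + comps

# The ratio comparisons / (n * log2(n)) should stay between two constants,
# which is what "merge sort is Theta(n*log(n))" promises.
for n in [2**k for k in range(8, 14)]:
    _, comps = merge_sort_comparisons(random.sample(range(10 * n), n))
    print(n, comps, round(comps / (n * math.log2(n)), 3))
```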

A bit deeper mathematical explanation

Big O gives an asymptotic upper bound. If T(n) is O(f(n)), it means that from a certain n0 on, there is a constant C such that T(n) <= C * f(n). Big Omega, on the other hand, says there is a constant C2 such that T(n) >= C2 * f(n).
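Spelled out in the same notation (a restatement of the above, with n0 the threshold and C, C2 positive constants):

```latex
O(f(n))      = \{\, T(n) : \exists\, C > 0,\ n_0 \text{ s.t. } T(n) \le C \cdot f(n) \ \forall n \ge n_0 \,\}
\Omega(f(n)) = \{\, T(n) : \exists\, C_2 > 0,\ n_0 \text{ s.t. } T(n) \ge C_2 \cdot f(n) \ \forall n \ge n_0 \,\}
\Theta(f(n)) = O(f(n)) \cap \Omega(f(n))
```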

Do not confuse!

Not to be confused with worst, best and average case analysis: all three notations (Omega, O, Theta) are unrelated to the best, worst and average case analysis of algorithms. Each one of them can be applied to each analysis.

We usually use it to analyze the complexity of algorithms (like the merge sort example above). When we say "algorithm A is O(f(n))", what we really mean is "the algorithm's complexity under the worst1 case analysis is O(f(n))" - meaning it scales "similarly" to (or formally, no worse than) the function f(n).

Why do we care about the asymptotic bound of an algorithm?

Well, there are many reasons for it, but I believe the most important of them are:

  1. It is much harder to determine the exact complexity function, so we "compromise" on the big-O/big-Theta notations, which are informative enough theoretically.
  2. The exact number of operations is also platform dependent. For example, say we have a vector (list) of 16 numbers. How many operations will adding them take? The answer is: it depends. Some CPUs allow vector additions, while others don't, so the answer varies between different implementations and different machines, which is an undesired property. The big-O notation, however, is much more constant between machines and implementations.

To demonstrate this issue, have a look at the following graphs:

[graph comparing the growth of f(n) = log n, f(n) = n, f(n) = 2*n and f(n) = n^2]

It is clear that f(n) = 2*n is "worse" than f(n) = n. But the difference is not nearly as drastic as the difference from the other functions: we can see that f(n) = log n quickly gets much lower than the others, and f(n) = n^2 quickly gets much higher.
So - because of the reasons above, we "ignore" the constant factors (the 2* in the graph's example) and take only the big-O notation.

In the above example, f(n) = n and f(n) = 2*n will both be in O(n) and in Omega(n) - and thus will also be in Theta(n).
On the other hand - f(n) = log n will be in O(n) (it is "better" than f(n) = n), but will NOT be in Omega(n) - and thus will also NOT be in Theta(n).
Symmetrically, f(n) = n^2 will be in Omega(n), but NOT in O(n) - and thus is also NOT in Theta(n).
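A small sketch to see those memberships numerically (the function table is mine, just mirroring the graphs): divide each f(n) by n and watch the ratio as n grows - a ratio that settles at a positive constant means Theta(n), one that shrinks to 0 means the function is in O(n) but not Omega(n), and one that blows up means Omega(n) but not O(n).

```python
import math

# f(n)/n as n grows: constant => Theta(n); -> 0 => not Omega(n); -> infinity => not O(n)
functions = {
    "log n": lambda n: math.log(n),
    "n":     lambda n: float(n),
    "2*n":   lambda n: 2.0 * n,
    "n^2":   lambda n: float(n * n),
}

for n in (10, 1_000, 100_000):
    ratios = {name: round(f(n) / n, 4) for name, f in functions.items()}
    print(n, ratios)
```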


1Usually, though not always: when the analysis class (worst, average or best) is missing, we really mean the worst case.

Ziyaddin Sadigov
amit
  • +1 for the nice explanation. I have a confusion: you have written in the last line that f(n)=n^2 will be in Omega(n), but not in O(n), and thus is also not Theta(n) - how? – kTiwari Sep 10 '12 at 04:24
  • 4
    @krishnaChandra: `f(n) = n^2` is asymptotically stronger than `n`, and thus is Omega(n). However, it is not O(n) (because for large enough `n` values, it is bigger than `c*n`, for every constant `c`). Since we said Theta(n) is the intersection of O(n) and Omega(n), and it is not O(n), it cannot be Theta(n) either. – amit Sep 10 '12 at 05:04
  • 12
    It's great to see someone explain how big-O notation isn't related to the best/worst case running time of an algorithm. There are so many websites that come up when I google the topic that say O(T(n)) means the worst case running time. – Will Sewell Feb 21 '13 at 09:46
  • 1
    @almel It's 2*n (2n, two times n) not 2^n – amit Sep 08 '14 at 22:04
  • Really useful explanation of theta in a simple way – Ninja Sep 18 '14 at 05:04
  • "O(n) is asymptotic lower bound" - did you mean Omega? O(n) is the upper bound. – whonoes Nov 03 '14 at 20:37
  • @whonoes thank you, seems like it got confused during edits or something (or maybe I just misspelled it in the first place). Fixed that. Thanks for letting me know – amit Nov 06 '14 at 09:07
  • For n^2, you can use the `<sup>` tag. Anyway, wonderful answer. – rpax Apr 20 '15 at 05:43
  • @Xenomorph That was supposed to be "worse". thanks. Fixed, along with some typos. – amit May 25 '15 at 19:10
  • A beautiful explanation! Can we say that **1.** `O(f(n))` represents the behavior of an algorithm when `n` (the number of input elements) `tends to infinity`. **2.** `Omega(f(n))` represents the behavior of an algorithm when `n tends to zero`. **3.** `Theta(f(n))` represents the actual behavior of the algorithm for `0 < n < infinity`? *Please correct me if I am wrong.* – Vishal K Jun 12 '16 at 06:53
  • 6
    @VishalK 1. Big O is the **upper** bound as _n_ tends to infinity. 2. Omega is the **lower** bound as _n_ tends to infinity. 3. Theta is both the **upper and lower** bound as _n_ tends to infinity. Note that all bounds are only valid "as _n_ tends to infinity", because the bounds do not hold for low values of _n_ (less than _n0_). The bounds hold for all _n_ ≥ _n0_, but not below _n0_ where lower order terms become dominant. – bain Dec 04 '16 at 11:28
  • @bain : Thank you so much for clarifying my doubts. – Vishal K Dec 05 '16 at 05:51
  • I still don't understand how O and Omega are not worst and best case. Could you provide an example with best, worst and average cases of an algorithm and provide their corresponding O, Theta and Omega? – pavel_orekhov Nov 12 '19 at 02:30
  • Is O just about being not slower? And Omega just about being not faster? So, in the worst case, merge sort is not faster than nlogn and is not slower than nlogn? So, can we say that in the worst case it's exactly nlogn then? Or what does Theta mean in this case? – pavel_orekhov Nov 12 '19 at 02:44
  • What is an example of an algorithm whose Theta is different from its O? Does this kind of algorithm even exist? – pavel_orekhov Nov 12 '19 at 02:46
  • 1
    @hey_you Read the answer again. Big O, Theta and Omega are for functions, not algorithms. Merge sort is Omega(n) worst case. It is also O(n^2) best case. It is also Theta(nlogn) worst case. Basically, for each analysis (worst/best/average/...) you have a complexity function `T_best(n), T_worst(n), T_average(n)`. They do not have to be identical (and mostly, they are not). O/Omega/Theta can be applied to any of them independently. – amit Nov 14 '19 at 22:26
  • @amit, so, O(n log n) - not faster than n log n, Omega(n log n) - not slower than n log n, Theta(n log n) - exactly n log n? Is that the meaning behind these functions? – pavel_orekhov Nov 14 '19 at 23:42
  • 1
    @hey_you, as amit said in his answer, "Big O gives an asymptotic upper bound. If T(n) is O(f(n)), it means that from a certain n0, there is a constant C such that T(n) **<=** C * f(n). On the other hand, big-Omega says there is a constant C2 such that T(n) **>=** C2 * f(n).", so you are right. Please note that in the case of the small-o and small-omega functions, your statement isn't true. For more reading about the specified functions, please check this [thread](https://stackoverflow.com/questions/1364444/difference-between-big-o-and-little-o-notation) – Perspicacious Jun 21 '20 at 09:45
  • Finally an understandable answer. – xaled Aug 23 '23 at 11:09
105

It means that the algorithm is both big-O and big-Omega in the given function.

For example, if it is Ө(n), then there is some constant k such that your function (run-time, whatever) is larger than n*k for sufficiently large n, and some other constant K such that your function is smaller than n*K for sufficiently large n.

In other words, for sufficiently large n, it is sandwiched between two linear functions:

For k < K and n sufficiently large, n*k < f(n) < n*K
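As a quick numeric sanity check (the concrete f(n) = 3n + 7 and the constants k = 3, K = 4 are my own illustration, not from the answer):

```python
# f(n) = 3n + 7 is Theta(n): 3*n < 3n + 7 < 4*n once n > 7.
def f(n):
    return 3 * n + 7

k, K, n0 = 3, 4, 8  # the strict sandwich holds from n0 = 8 onward
assert all(k * n < f(n) < K * n for n in range(n0, 10_000))
print(f"{k}*n < f(n) < {K}*n for all tested n >= {n0}")
```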

Pierre-Antoine Guillaume
happydave
  • It does not, those variables are a bit confusing, they are unrelated. – Aaron Robeson Sep 25 '18 at 21:10
  • @committedandroider No, they are lowercase and uppercase thus different, he's using typical mathematical style in which two "similar" (but not related in any way here) variables use big and small case. – Santropedro Jul 07 '19 at 21:08
15

Theta(n): A function f(n) belongs to Theta(g(n)) if there exist positive constants c1 and c2 such that f(n) can be sandwiched between c1*g(n) and c2*g(n), i.e. it gives both an upper and a lower bound.

Theta(g(n)) = { f(n) : there exist positive constants c1, c2 and n1 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n1 }

Because f(n) is pinned between c1*g(n) and c2*g(n), this represents an asymptotically tight bound.

O(n): It gives only an upper bound (which may or may not be tight)

O(g(n)) = { f(n) : there exist positive constants c and n1 such that 0 <= f(n) <= c*g(n) for all n >= n1 }

ex: The bound 2*(n^2) = O(n^2) is asymptotically tight, whereas the bound 2*n = O(n^2) is not asymptotically tight.

o(n): It gives only an upper bound (never a tight bound)

The notable difference between O(n) and o(n) is that in o(n), f(n) must be strictly less than c*g(n) for every positive constant c (for all n >= n1), not just for some constant c as in O(n).

ex: 2*n = o(n^2), but 2*(n^2) != o(n^2)
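A hedged sketch of the O vs o difference in code: for little-o, the ratio f(n)/g(n) must shrink toward 0, while for a (tight) big-O it may settle at a positive constant.

```python
# 2*n = o(n^2): the ratio 2n / n^2 tends to 0.
# 2*(n^2) is O(n^2) but NOT o(n^2): the ratio 2n^2 / n^2 stays at the constant 2.
for n in (10, 1_000, 100_000):
    print(n, "2n/n^2 =", 2 * n / n**2, " 2n^2/n^2 =", 2 * n**2 / n**2)
```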

Mysticial
thisizkp
  • 1
    You didn't mention big Omega, which refers to the lower-bound. Otherwise, very nice first answer and welcome! – bohney Oct 06 '12 at 17:01
  • 1
    i liked the way he framed the definition of Theta(n). Upvoted! – user720694 Oct 20 '13 at 14:48
  • Is it right to think of theta as the 'average' time for a function? I keep hearing people refer to it as the average, but I am not sure if the fact that it is simply constrained by an upper and lower boundary really means it's an average. – berimbolo Nov 26 '20 at 22:23
7

I hope this is what you may want to find in the classical CLRS (page 66):

[excerpt from CLRS, page 66]

Lerner Zhang
2

Big Theta notation:

Nothing to mess up, buddy!!

If we have positive-valued functions f(n) and g(n) of a positive-valued argument n, then ϴ(g(n)) is defined as {f(n): there exist constants c1, c2 and n1 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n1}

Let's take an example:

let f(n) = 5n^2 + 2n + 1

g(n) = n^2

c1 = 5, c2 = 8 and n1 = 1

Among all the notations, the ϴ notation gives the best intuition about the rate of growth of a function, because it gives us a tight bound, unlike big-oh and big-omega, which give only the upper and lower bounds respectively.

ϴ tells us that g(n) is as close to f(n) as possible - the rate of growth of g(n) is as close to the rate of growth of f(n) as possible.
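A quick sketch checking the constants in the example above (the loop bound is arbitrary; the inequality itself holds for every n >= 1):

```python
# Verify c1*g(n) <= f(n) <= c2*g(n) with c1 = 5, c2 = 8, n1 = 1.
def f(n):
    return 5 * n**2 + 2 * n + 1

def g(n):
    return n**2

assert all(5 * g(n) <= f(n) <= 8 * g(n) for n in range(1, 100_000))
print("5*n^2 <= 5n^2 + 2n + 1 <= 8*n^2 for all tested n >= 1")
```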


2

First of all, the theory:

  1. Big O = upper bound, O(n)

  2. Big Theta = tight bound (the order of the function), ϴ(n)

  3. Big Omega = lower bound, Ω(n)

Why are people so confused?

In many blogs & books, such statements are emphasised like

"This is Big O(n^3)" etc.

and people often confuse them, as if

O(n) == theta(n) == Omega(n)

But what is worth keeping in mind is that these are just mathematical sets of functions with the names O, Theta & Omega,

all taken over the same polynomial.

Let

f(n) = 2n^4 + 100n^2 + 10n + 50, then

g(n) = n^4, i.e. g(n) is the term of f(n) with the biggest power.

The same f(n) & g(n) are used in all the explanations below.

Big O - provides an upper bound

Big O(n^4): pick c*g(n) = 3n^4, because 3n^4 > 2n^4 (the leading term of f(n)).

Here n^4 plays the role of a single variable x, so we are really just comparing f(n) against a constant multiple of that variable. The general concept is

0 ≤ f(n) ≤ c*g(n), with c*g(n) = 3n^4

Putting in the values (for sufficiently large n),

0 ≤ 2n^4 + 100n^2 + 10n + 50 ≤ 3n^4

3n^4 is our upper bound.

Big Omega - provides a lower bound

Omega(n^4): pick c*g(n) = 2n^4, because 2n^4 ≤ our example f(n).

So, 0 ≤ c*g(n) ≤ f(n)

0 ≤ 2n^4 ≤ 2n^4 + 100n^2 + 10n + 50

2n^4 is our lower bound.

Theta - provides a tight bound

This is calculated to find out whether the lower bound is similar to the upper bound.

Case 1) The upper bound is similar to the lower bound

If the upper bound is similar to the lower bound, the average case is similar too.

Example: 2n^4 ≤ f(n) ≤ 2n^4,
then Theta(n^4) = 2n^4

Case 2) The upper bound is not similar to the lower bound

In this case, Theta is not a single function; Theta(n^4) is the set of functions with the same order of growth as g(n).

Example: 2n^4 ≤ f(n) ≤ 3n^4 - this is our default case.
Then Theta(n^4) = c'*n^4, the set of functions with 2 ≤ c' ≤ 3
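A short sketch checking the bounds used throughout this answer (the threshold n >= 11 is my addition - below it the lower-order terms 100n^2 + 10n + 50 still outweigh the extra n^4):

```python
# f(n) = 2n^4 + 100n^2 + 10n + 50 is sandwiched by 2n^4 and 3n^4 once n >= 11.
def f(n):
    return 2 * n**4 + 100 * n**2 + 10 * n + 50

assert all(2 * n**4 <= f(n) <= 3 * n**4 for n in range(11, 10_000))
print("2n^4 <= f(n) <= 3n^4 for all tested n >= 11")
```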

Hope this explained it!!

Ziyaddin Sadigov
Akash
0

I am not sure why there is no short, simple answer explaining big Theta in plain English (that seems to be the question), so here it is:

Big Theta is the range of values, or the exact value (if big O and big Omega are equal), within which the number of operations needed by a function grows.

Yehuda Schwartz