
I have been through this Big-Oh explanation, understanding the complexity of two loops, the difference between Big-Theta and Big-Oh, and also through this question.

I understand that we cannot simply say that Big-Oh is the worst case, Omega the best case, and Theta the average case; Big-Oh has its own best, worst, and average cases. But how do we find out whether a specific algorithm belongs to Big-Oh, Big-Theta, or Big-Omega? And how can we check whether an algorithm belongs to all of these?

AHF
  • Theta bounds are much harder to prove than Big-O, so are rarely used even though they would generally be better. Omega bounds are generally less useful as they are lower bounds and we usually care more about upper bounds. – Chris Dodd Nov 15 '19 at 20:06
  • Does every algorithm contain these three? Big-oh, Big-theta, and Big-Omega? – AHF Nov 15 '19 at 20:08
  • Algorithms don't "contain" any of those bounds. You may be able to prove various bounds about any given algorithm, but you don't know for sure until it's been proven. – Chris Dodd Nov 15 '19 at 20:12
  • There is no general method. Statements about the asymptotic behaviour of algorithms are just mathematical statements, and there is no general way to prove a statement (i.e. to turn one into a theorem). – n. m. could be an AI Nov 15 '19 at 20:49
  • But on what basis do we say that insertion sort's worst case is Theta(n^2) and not Big-Oh? Or can we say that it is Big-Oh(n^2)? – AHF Nov 16 '19 at 15:29
  • 1
    "insertion sort's worst-case scenario is theta(n^2)" is a theorem that needs to be proven. "insertion sort's worst-case scenario is O(n^2)" is another theorem. It just so happens that it is a trivial corollary of the previous theorem. The theta notation provides both upper and lower limits while O provides only an upper limit. Thus the statement with the theta is a strictly more powerful one. – n. m. could be an AI Nov 17 '19 at 09:49

1 Answer


A function f(n) is Big-Oh of a function g(n), written f(n) = O(g(n)), if there exist a positive constant c and a natural number n0 such that for n > n0, f(n) <= c * g(n). A function f(n) is Big-Omega of g(n), written f(n) = Omega(g(n)), if and only if g(n) = O(f(n)). A function f(n) is Theta of a function g(n), written f(n) = Theta(g(n)), if and only if f(n) = O(g(n)) and f(n) = Omega(g(n)).
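To see how the Big-Oh definition is applied in practice, here is a small worked example (the functions are chosen here purely for illustration, not taken from the question):

    % Claim: 3n^2 + 5n = O(n^2).
    % For n >= 1 we have 5n <= 5n^2, so:
    \[
      3n^2 + 5n \;\le\; 3n^2 + 5n^2 \;=\; 8n^2 \quad \text{for all } n \ge 1,
    \]
    % hence c = 8 and n_0 = 1 witness the definition.

Running the same kind of calculation in the other direction gives an Omega bound, and having both gives Theta.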

To prove any of the three, you show that some function(s) are Big-Oh of some other function(s). Showing that one function is Big-Oh of another is a difficult problem in the general case. Any form of mathematical proof may be helpful. Induction proofs combined with intuition for the base cases are not uncommon. Basically: guess values for c and n0 and see if they work, or fix one of the two and work out a reasonable value for the other, as in the sketch below.
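As an illustration of fixing one constant and solving for the other (again with functions chosen purely for demonstration):

    % Claim: 2n + 3 = O(n). Fix c = 3 and solve for n_0:
    \[
      2n + 3 \;\le\; 3n \iff n \ge 3,
    \]
    % so with c = 3 we may take n_0 = 3 and the definition holds.

Fixing c first tends to force a larger n_0, while fixing n_0 first tends to force a larger c; any pair of witnesses that works is equally valid.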

Note that a function may not be Big-Theta of any other (simpler) function if its tightest bounds from above and below are functions with different asymptotic rates of growth; a concrete example follows below. However, I think it's usually a safe bet that most functions are going to be Big-Oh of something reasonably uncomplicated, and all functions typically looked at from this perspective are at least Omega(1), i.e., bounded below by a constant, in the best case.
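For a concrete instance of a function with no Theta bound among the standard growth rates (constructed here purely as an illustration), consider one that alternates between two rates:

    \[
      f(n) =
      \begin{cases}
        n^2 & \text{if } n \text{ is even},\\
        n   & \text{if } n \text{ is odd}.
      \end{cases}
    \]
    % f(n) = O(n^2) and f(n) = Omega(n), but f is not Theta(n^2)
    % (the odd terms break the lower bound) and not Theta(n)
    % (the even terms break the upper bound).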

Patrick87
  • For insertion sort we say its worst case is Theta(n^2); why do we not say that its worst case is Big-Oh(n^2)? What makes us consider Theta and not Big-Oh? – AHF Nov 16 '19 at 15:27
  • 1
    You can say either; but Theta is a stronger statement than O. It's like saying something is an orange vs. merely saying it's a fruit; it's more specific to say it's an orange, and less specific to say it's a fruit. The upside is that because O(f) is less specific than Theta(f), then it's easier to prove that an algorithm is O(f) than that it's Theta(f). – kaya3 Nov 17 '19 at 18:44