10

I'm confused: I thought you use Big O for the worst-case running time and Ω for the best case? Can someone please explain?

And isn't lg n the best case and n lg n the worst case? Or am I misunderstanding something?

Show that the worst-case running time of Max-Heapify on a heap of size n is Ω(lg n). (Hint: For a heap with n nodes, give node values that cause Max-Heapify to be called recursively at every node on a path from the root down to a leaf.)

Edit: No, this is not homework. I'm practicing, and this has an answer key, but I'm confused. http://www-scf.usc.edu/~csci303/cs303hw4solutions.pdf Problem 4 (6.2-6)

Edit 2: So I misunderstood the question, and it's not really about Big O vs. Ω?

jantristanmilan

3 Answers

22

It is important to distinguish between the case and the bound.

Best, average, and worst are common cases of interest when analyzing algorithms.

Upper (O, o) and lower (Omega, omega), along with Theta, are common bounds on functions.

When we say "Algorithm X's worst-case time complexity is O(n)", we're saying that the function which represents Algorithm X's performance, when we restrict inputs to worst-case inputs, is asymptotically bounded from above by some linear function. You could just as well speak of a lower bound on the worst-case behavior, or an upper or lower bound on the average-case or best-case behavior.
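For reference, here are the standard formal definitions behind those bounds (my addition, stated in the usual CLRS style; f is the running-time function being bounded and g is the comparison function):

```latex
% Upper bound: f grows no faster than g, up to a constant factor
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0

% Lower bound: f grows at least as fast as g, up to a constant factor
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \text{ such that } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0
```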

Case != Bound. That said, "upper on the worst" and "lower on the best" are pretty sensible sorts of metrics... they provide absolute bounds on the performance of an algorithm. It doesn't mean we can't talk about other metrics.

Edit to respond to your updated question:

The question asks you to show that Omega(lg n) is a lower bound on the worst case behavior. In other words, when this algorithm does as much work as it can do for a class of inputs, the amount of work it does grows at least as fast as (lg n), asymptotically. So your steps are the following: (1) identify the worst case for the algorithm; (2) find a lower bound for the runtime of the algorithm on inputs belonging to the worst case.

Here's an illustration of the way this would look for linear search:

In the worst case of linear search, the target item is not in the list, and all items in the list must be examined to determine this. Therefore, a lower bound on the worst-case complexity of this algorithm is Ω(n).
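As a small concrete sketch of that worst case (my own illustration in Python, not part of the original answer):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):   # examines the items one by one
        if value == target:
            return i
    return -1                           # reached only after all n comparisons

# Worst case: the target is not in the list, so every element is inspected.
# The loop runs exactly n times, so the worst-case running time is bounded
# below (and above) by a linear function of n.
worst_case_input = list(range(1000))
assert linear_search(worst_case_input, -1) == -1
```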

Important to note: for lots of algorithms, the complexity for most cases will be bounded from above and below by a common set of functions. It's very common for the Theta bound to apply. So it might very well be the case that you won't get a different answer for Omega than you do for O, in any event.
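Applying the same two steps to the exercise itself, here is a rough sketch (my own illustration, using an array-based max-heap written 0-indexed in Python rather than 1-indexed as in CLRS): place the smallest key at the root of an otherwise valid max-heap, and Max-Heapify is forced to recurse along an entire root-to-leaf path of about lg n nodes.

```python
import math

def max_heapify(a, i, calls=None):
    """Sift a[i] down in the array-based max-heap a; log each call index."""
    if calls is not None:
        calls.append(i)
    left, right, largest = 2 * i + 1, 2 * i + 2, i
    if left < len(a) and a[left] > a[largest]:
        largest = left
    if right < len(a) and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, calls)

# Worst-case style input (per the hint): a valid max-heap except that the
# root holds the smallest key, so it must sift all the way down to a leaf.
n = 31
heap = list(range(n, 0, -1))   # n, n-1, ..., 1 is a valid max-heap
heap[0] = 0                    # smallest key at the root

calls = []
max_heapify(heap, 0, calls)
print(len(calls), math.floor(math.log2(n)) + 1)   # prints: 5 5
# The number of calls equals the length of a root-to-leaf path,
# about floor(lg n) + 1, so the worst case does Omega(lg n) work.
```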

Patrick87
  • I was under the impression that the sensible metric was "upper bound on the average case", except the average case tends to be more difficult to determine than the worst case. (Consider quicksort, where the worst case is pathological.) – millimoose Mar 14 '13 at 22:16
  • @millimoose I would rather say that the sensible bound is the upper bound on the worst case, and that Quicksort is the exception (in that we prefer to ignore the pathological worst case in order to justify using it over a true O(n log n) worst-case algorithm). That said, I have no references or credentials to back up the claim that "upper bound on worst case" is more "meaningful" than "upper bound on average case", so I can clarify that this is my assessment, not generally accepted. – Patrick87 Mar 14 '13 at 22:30
  • In any given program, are you constantly getting worst-case input? Usually not, so average-case complexity will be what determines your actual observed performance, for any algorithm, pathological or not. However, a corollary of this is that algorithms with a pathological worst-case complexity can lead to DOS exploits, like with the recent-ish Java [`parseDouble()`](http://www.exploringbinary.com/java-hangs-when-converting-2-2250738585072012e-308/) bug. Another would be that it makes sense to use algorithms appropriate for the sort of input data you have, which is why *timsort* is awesome. – millimoose Mar 14 '13 at 22:38
  • @millimoose In general, I'd agree that the metric on which one bases the choice of an algorithm should depend on the circumstances. For instance, some situations might call for an algorithm with good average-case performance and tolerate occasional bad performance, while others might call for more consistently scalable performance. In theoretical contexts, upper/worst and lower/best are more important because they say things about where problems live in complexity classes. – Patrick87 Mar 14 '13 at 23:01
  • @Patrick87 is "Therefore, a lower bound on the worst-case complexity of this algorithm is O(n)." a typo? Shouldn't it be: "Therefore, a lower bound on the worst-case complexity of this algorithm is Omega(n)?". – tmakino Jan 31 '18 at 20:15
-1

Actually, Big O describes a function that grows at least as fast as your worst-case complexity (an upper bound), and Ω describes a function that grows no faster than your worst-case complexity (a lower bound).

So here you are asked to prove that your worst-case complexity grows at least as fast as lg(n).
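Concretely, unpacking the Ω definition (my restatement, not part of the original answer, with T_worst(n) denoting the worst-case running time of Max-Heapify on heaps of size n), the statement to prove is:

```latex
\exists\, c > 0,\ n_0 > 0 \text{ such that } T_{\text{worst}}(n) \ge c \lg n \text{ for all } n \ge n_0
```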

Axelle Ziegler
-1

O is the upper limit (i.e., worst case); Ω is the lower limit (i.e., best case).

The exercise is saying that on the worst input for Max-Heapify (I guess the worst input is reverse-ordered input), the running time must be at least lg n. Hence the Ω(lg n), since it is the lower limit on the execution complexity.

fabiim