5

I'm studying asymptotic notation from the book and I can't understand what the author means. I know that if f(n) = Θ(n^2) then f(n) = O(n^2). However, I understand from the author's words that for the insertion sort algorithm, f(n) = Θ(n) and f(n) = O(n^2).

Why? Does the big omega or big theta change with different inputs?

He says that:

"The Θ(n^2) bound on the worst-case running time of insertion sort, however, does not imply a Θ(n^2) bound on the running time of insertion sort on every input. "

However, it is different for big-O notation. What does he mean? What is the difference between them?

I'm so confused. I'm copy-pasting it below:

Since O-notation describes an upper bound, when we use it to bound the worst-case running time of an algorithm, we have a bound on the running time of the algorithm on every input. Thus, the O(n^2) bound on worst-case running time of insertion sort also applies to its running time on every input. The Θ(n^2) bound on the worst-case running time of insertion sort, however, does not imply a Θ(n^2) bound on the running time of insertion sort on every input. For example, when the input is already sorted, insertion sort runs in Θ(n) time.
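To see the book's point concretely, here is a minimal sketch (the function name and comparison-counting scheme are my own, not from the book) that counts key comparisons made by insertion sort on an already-sorted and a reverse-sorted input of the same size. Both runs are covered by the O(n^2) worst-case bound, but only the reversed input actually does a quadratic amount of work:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` with insertion sort; return the number of key comparisons."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break                 # sorted prefix: stop after one comparison
        a[j + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))           # already sorted input
worst = insertion_sort_comparisons(range(n, 0, -1))   # reverse-sorted input
```

For n = 100, the sorted input takes n − 1 = 99 comparisons (linear, hence Θ(n) on that input class), while the reversed input takes n(n − 1)/2 = 4950 (quadratic). The single statement f(n) = O(n^2) is true of both runs.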

hakre
oiyio

5 Answers

4

Does the big omega or big theta change with different inputs?

Yes. To give a simpler example, consider linear search in an array from left to right. In the worst and average case, this algorithm takes f(n) = a × n/2 + b expected steps for some constants a and b. But when the left element is guaranteed to always hold the key you're looking for, it always takes a + b steps.

Since Θ denotes a tight bound, and Θ(n) ≠ Θ(n²), it follows that the Θ for the two kinds of input is different.
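The linear-search example above can be sketched the same way; this is an illustrative step counter (the names are hypothetical, not from any library) showing that the probe count depends on where the key sits:

```python
def linear_search_steps(a, key):
    """Scan `a` left to right for `key`; return (index, number of probes)."""
    steps = 0
    for i, x in enumerate(a):
        steps += 1                 # one probe of position i
        if x == key:
            return i, steps
    return -1, steps               # key absent: all n positions probed

data = list(range(1000))
_, best = linear_search_steps(data, 0)      # key at the left end
_, worst = linear_search_steps(data, 999)   # key at the right end
```

With the key guaranteed to be at index 0 the search takes a single probe, so on that input class it runs in Θ(1); with the key at the far end it takes all n probes. O(n) covers both, but the tight bound differs per input class.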

EDIT: as for Θ and big-O being different on the same input, yes, that's perfectly possible. Consider the following (admittedly trivial) example.

When we set n to 5, then n = 5 and n < 6 are both true. But when we set n to 1, then n = 5 is false while n < 6 is still true.

Similarly, big-O is just an upper bound, much like ≤ on numbers, while Θ is a tight bound, like =.

(Actually, Θ is more like a ≤ n ≤ b for constants a and b, i.e. it defines something analogous to a range of numbers, but the principle is the same.)

Fred Foo
  • For the same function and the same input, is it possible that the big-O and big-Θ are different? – oiyio Oct 10 '12 at 09:36
  • 1
    To elaborate, `Theta(f(n))` is actually the *intersection* of `O(f(n))` and `Omega(f(n))`. Everything that is `Theta(f(n))` is also `O(f(n))` - but not the other way around. – amit Oct 10 '12 at 10:07
  • In the link which David gave, someone says: "Basically when we say an algorithm is of O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2)." Can this sentence be the exact answer to my question? The author says that "O(n^2) applies to every input." Even if the best case is f(n) = O(1), we can say f(n) = O(n^2) because the worst case is O(n^2). However, since Θ(n) is not Θ(n^2), we cannot say the same thing for Θ, unlike with big-O notation. Am I right? – oiyio Oct 10 '12 at 16:08
  • 1
    @user1308990: no. Θ vs. O is completely unrelated to best case vs. expected vs. worst case. You can say f(n) = O(n²) even when f(n) = O(1), just because n² > 1 for every n greater than some constant. – Fred Foo Oct 10 '12 at 17:33
0

Refer to CLRS, 3rd edition, page 44 (Asymptotic notation, functions, and running times). It says:

Even when we use asymptotic notation to apply to the running time of an algorithm, we need to understand which running time we mean. Sometimes we are interested in the worst-case running time. Often, however, we wish to characterize the running time no matter what the input. In other words, we often wish to make a blanket statement that covers all inputs, not just the worst case.

Takeaways from the above paragraph:

  • The worst case provides an upper limit on the running time of an algorithm. Thus, the O(n^2) bound on the worst-case running time of insertion sort also applies to its running time on every input.

  • But the Θ(n^2) bound on the worst-case running time of insertion sort does not imply a Θ(n^2) bound on its running time on every input, because the best-case running time of insertion sort is Θ(n) (when the list is already sorted).

  • We usually state the worst-case time complexity of an algorithm, but when the best and average cases are taken into account, the time complexity varies according to the case being analyzed.

Satyendra Yadav
0

In simple words, the running time of a program is described as a function of its input size, i.e. f(n).

The = here is asymmetric: an + b = O(n) means that f(n) = an + b belongs to the set O(n). So we can also say an + b = O(n^2), and it's true, because for any constants a and b, f(n) belongs to the set O(n^2) as well.

Thus big-O only gives an upper bound; you can say the notation makes a blanket statement, meaning all inputs of a given size are covered, not just the worst case (for insertion sort, an array of size n in reverse order).

So n = O(n^2) is true, but it is an abuse of notation when defining the worst-case running time of an algorithm, since the worst-case running time gives an upper bound on the running time for any input. As we know, for insertion sort the running time depends on how sorted the input array of a fixed size already is: if the array is sorted, the running time is linear.

So we need a tight asymptotic bound notation to describe the worst case, which is what Θ notation provides; thus the worst case of insertion sort is Θ(n^2) and the best case is Θ(n).

HarshitChopra
-1

we have a bound on the running time of the algorithm on every input

It means that if there is a set of inputs with running time n^2 while others have less, then the algorithm is O(n^2).

The Θ(n^2) bound on the worst-case running time of insertion sort, however, does not imply a Θ(n^2) bound on the running time of insertion sort on every input

He is saying that the converse is not true. If an algorithm is O(n^2), it doesn't mean every single input will run in quadratic time.

UmNyobe
  • 1
    Big-O is not the same as "worst case" as you are implying. If there is a set of inputs with running time n^2 while others have less, then the algorithm takes O(n^2) and Theta(n^2) *in the worst case*. Just saying "the algorithm is O(n^2)" is meaningless. – interjay Oct 10 '12 at 09:53
  • It *is* Theta(n^2) in the worst case. If you are looking at the worst case than the other cases aren't a factor. Even the part quoted from the book by the OP gives an example of exactly such a case. – interjay Oct 10 '12 at 12:48
  • "Theta(n^2) in the worst case" implies the complexity for any input in worst case is quadratic, no less and no more. – UmNyobe Oct 10 '12 at 12:52
  • Remember, in my answer I never talked about worst case or average case or best case; I talked about running time. So I am not mixing big-O and worst case. – UmNyobe Oct 10 '12 at 12:54
-4

My academic study of the insertion sort algorithm is far in the past, but from what I understand of your copy-paste:

big-O notation always describes the worst case, but big-Θ describes some sort of average for typical data.

Take a look at this: What is the difference between Θ(n) and O(n)?

David
  • 3
    Nope, that's not what Θ and big-O mean. Θ is a strict bound, O an upper bound. Both can be applied to average, worst-case, best-case or amortized analysis. – Fred Foo Oct 10 '12 at 09:30