
I found a couple of questions on geeksforgeeks.org that I can't seem to understand (#1 and #3). I was hoping someone could clarify the answers for me:

Clarify whether each statement is true/valid or false:

1. Time Complexity of QuickSort is Θ(n^2)

I answered true, but it is false. Why? If quicksort has a time complexity of O(n^2), and we know that Θ(g(n)) = {f(n) : c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}, then doesn't that prove it is true, since c2*g(n), being the upper bound, can equal f(n)?
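For reference, these are the definitions I am working from (my own restatement of the standard textbook ones, so apologies if I have mangled them):

```latex
\begin{align*}
O(g(n))      &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0 \,\} \\
\Omega(g(n)) &= \{\, f(n) : \exists\, c, n_0 > 0 \text{ s.t. } 0 \le c\,g(n) \le f(n) \text{ for all } n \ge n_0 \,\} \\
\Theta(g(n)) &= \{\, f(n) : \exists\, c_1, c_2, n_0 > 0 \text{ s.t. } c_1\,g(n) \le f(n) \le c_2\,g(n) \text{ for all } n \ge n_0 \,\}
\end{align*}
```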

2. Time Complexity of QuickSort is O(n^2) - true

3. Time complexity of all computer algorithms can be written as Ω(1)

This is true, but I have no understanding of why. A search algorithm can have a lower bound of Ω(1), assuming we find what we were looking for on the first element, but how does this hold true for ALL computer algorithms, such as the insertion sort algorithm, where the worst case is O(n^2)?
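To make the insertion sort part of my confusion concrete, here is a toy sketch I put together (the `insertion_sort_comparisons` helper is just something I wrote for counting comparisons, nothing standard):

```python
def insertion_sort_comparisons(a):
    """Toy insertion sort that returns the number of key comparisons it makes."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1              # compare key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]           # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 1000
print("already sorted:", insertion_sort_comparisons(range(n)))        # about n - 1
print("reverse sorted:", insertion_sort_comparisons(range(n, 0, -1))) # about n^2 / 2
```

Even the best case does n - 1 comparisons, which grows with n, so I don't see why a constant bound is the right thing to state for it.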

link: http://www.geeksforgeeks.org/analysis-of-algorithms-set-3asymptotic-notations/

user2644819
  • These aren't really programming questions as such. You'd be better off asking on [cs.se] – Jan 23 '14 at 08:35
  • The exercise questions don't make any sense. There is no point in talking about the complexity without saying under which analysis, and thus they are ambiguous. Quicksort is Theta(n^2) worst case (and thus also O(n^2) and Omega(n^2)), and Theta(nlogn) average case (and thus also O(nlogn) and Omega(nlogn)). – amit Jan 23 '14 at 08:46
  • @amit I would partly disagree. You can always say that an algorithm is O(f(x)) if the worst case is O(f(x)). Similarly, you can always say that it's Omega(f(x)) if the best case is Omega(f(x)). And it's valid to ask whether an algorithm is Theta(f(x)), although you can rarely answer "yes". – Bartosz Klimek Jan 23 '14 at 09:00
  • So we can use big O, theta, and Omega each on either the worst case, average case, or best case? – user2644819 Jan 23 '14 at 09:05
  • @BartoszKlimek The point is, big Theta is a set of functions. There are quite a few algorithms with an *average case* of `Theta(f(n))` or a *worst case* of `Theta(f(n))` [quicksort, Theta(nlogn) average and Theta(n^2) worst, for example]. The point is, you need to know which function you want to include in which set. "Quicksort complexity" involves at least 3 functions: one for the worst case, one for the best case, and one for the average case. Obviously, one can think of more functions, and each belongs to different asymptotic families. – amit Jan 23 '14 at 09:05
  • @user2644819 Yes, since they are all sets of functions. I tried to explain this issue in [this thread](http://stackoverflow.com/a/12338937/572670) – amit Jan 23 '14 at 09:06
  • I read your answer, but I don't understand how merge sort can be both O(nlogn) and O(n^2) in the worst case. I do understand the part where you said that the worst case for the merge sort algorithm is both O(nlogn) and Omega(nlogn), and therefore also Theta(nlogn). – user2644819 Jan 23 '14 at 09:10
  • @user2644819 `O(nlogn)` is a *subset* of `O(n^2)`. Big O gives only an upper bound, so if something has an upper bound of `c*nlogn`, it also has an upper bound of `c*n^2` (for some constant c). Recall that by definition, big O means: f(n) is in O(nlogn) if there are constants c, N such that for all n > N, f(n) <= c*nlogn. Since nlogn < n^2, we get that f(n) <= c*n^2 as well, and by definition that means f(n) is in O(n^2). The bound for big O (and big Omega) does not have to be *tight* (see the short worked check after these comments). – amit Jan 23 '14 at 09:13
  • I do not understand how you can say the worst case for a merge sort is O(n^2) when it is O(nlogn), even if O(nlogn) is a subset of O(n^2). O(nlogn) is a subset of O(n^4), but that does not mean the worst case for a merge sort is O(n^4)... I'm not sure I follow. – user2644819 Jan 23 '14 at 09:26
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/45887/discussion-between-amit-and-user2644819) – amit Jan 23 '14 at 09:31
  • @amit: The exercise questions are precise and unambiguous. When you say quicksort is Big Theta(n^2), it means that for all inputs, the running time of the algorithm is equal to f(n)=n^2. As this is not true for all inputs of the quicksort algorithm, the statement is false. But when you say "quicksort is Big O(n^2)", it implies that for all input values, the running time of the algorithm is at most f(n)=n^2. In the case of the quicksort algorithm, the best case time complexity equals g(n)=nlogn. As g(n) is no more than f(n), the statement "quicksort is Big O(n^2)" is correct. – Tanmoy Banerjee Jan 23 '14 at 13:01
  • @tan I have never seen a definition in any literature that states that an 'algorithm complexity' is its complexity for 'any input'. Please refer me to such a definition. – amit Jan 23 '14 at 16:32
  • @amit Yes, you are correct: 'algorithm complexity' is not its complexity for 'any input'; it may be for the best case, worst case or average case input. But when someone says "Time Complexity of QuickSort is Θ(n^2)" without specifying whether it is for the best, worst or average case, then we have to take it to cover best, worst and average case input values; otherwise it is meaningless. – Tanmoy Banerjee Jan 23 '14 at 19:05
  • @tan I agree with the meaningless part, not with the "best, worst and average case input values". IMO, this question is just based on the common wrong assumption that big O refers to the worst case and big Omega to the best case. As said, AFAIK this is just a common mistake, and I have never seen it in any literature. – amit Jan 23 '14 at 20:03
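A short worked check of the subset claim discussed in the comments above (a sketch, using the big-O definition quoted there):

```latex
\begin{align*}
&\text{For all } n \ge 1:\quad n\log n \;\le\; n \cdot n \;=\; n^2. \\
&\text{So if } f(n) \le c\,n\log n \text{ for all } n > N, \text{ then } f(n) \le c\,n^2 \text{ for all } n > \max(N, 1), \\
&\text{i.e. } f(n) \in O(n\log n) \implies f(n) \in O(n^2), \text{ hence } O(n\log n) \subseteq O(n^2). \\
&\text{The converse fails: } n^2 \le c\,n\log n \text{ would force } n/\log n \le c \text{ for all large } n, \text{ which is impossible,} \\
&\text{so } n^2 \notin O(n\log n) \text{ and the containment is strict.}
\end{align*}
```

By the same argument O(nlogn) ⊆ O(n^4), so saying merge sort's worst case is O(n^2) or O(n^4) is technically true but not informative; the informative (tight) statement is the Θ(nlogn) one.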

2 Answers


Time Complexity of QuickSort is Θ(n^2) ---- This means that for every value of n, the time taken by the algorithm to produce the output is equal to a function f(n)=n^2. But we know this is not true for quicksort, because for some inputs the running time of quicksort is equal to a function g(n)=nlogn. So we need to specify whether we mean the worst, best or average case. It is correct to say "Worst case time complexity of quicksort is Θ(n^2)".
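To make this concrete, here is a rough sketch (the `quicksort_comparisons` helper is hypothetical: a toy quicksort that always picks the first element as the pivot and counts comparisons; already-sorted input drives this variant into its worst case):

```python
import random
from math import log2

def quicksort_comparisons(a):
    """Toy quicksort (first element as pivot) that returns its comparison count."""
    a = list(a)
    comparisons = 0
    stack = [(0, len(a) - 1)]            # explicit stack avoids deep recursion on sorted input
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]
        i = lo + 1
        for j in range(lo + 1, hi + 1):
            comparisons += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]   # put the pivot in its final position
        stack.append((lo, i - 2))           # left part
        stack.append((i, hi))               # right part
    return comparisons

n = 2000
print("shuffled input:", quicksort_comparisons(random.sample(range(n), n)),
      "comparisons, n*log2(n) =", round(n * log2(n)))
print("sorted input:  ", quicksort_comparisons(list(range(n))),
      "comparisons, n^2/2 =", n * n // 2)
```

The same algorithm takes on the order of nlogn comparisons on one input and on the order of n^2 on another, which is exactly why a single Θ bound cannot describe it without naming the case.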

"Time Complexity of QuickSort is O(n^2)"---this means for each input value of n,running time of the algorithm is at most a function which is f(n)=n^2.This implies there exist some input, for which the algorithm has a running time which may be less than f(n)=n^2.we know best case time complexity of quicksort is g(n)=nlogn and g(n)< f(n).As this statement covers all the cases so the statement is true.

Similarly, it is correct to say "Time complexity of quicksort is Ω(nlogn)", because this means the running time of the algorithm is at least nlogn, and n^2 > nlogn.
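Written out in symbols (a sketch, where T denotes quicksort's running time on an arbitrary input of size n):

```latex
\begin{align*}
&\text{For every input of size } n \ge n_0:\qquad c_1\, n\log n \;\le\; T \;\le\; c_2\, n^2
   \quad\text{for suitable constants } c_1, c_2 > 0, \\
&\text{so } T(n) \in O(n^2) \text{ and } T(n) \in \Omega(n\log n) \text{ hold for all inputs,} \\
&\text{but the two sides do not match, so no single } \Theta \text{ bound covers every case.}
\end{align*}
```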

"Time complexity of all computer algorithms can be written as Ω(1)"---here 1 represent constant time function.the above statement implies: to execute any computer algorithms we need a minimum constant time.which is correct for all computer algorithms.

Tanmoy Banerjee

The worst case scenario for QuickSort is O(n^2), but you expect it to run in O(n log n) time on average. Hence the running time of the algorithm varies per case, and you cannot use the theta symbol to give one general running time for the algorithm.

And of course the lower bound on the running time of any algorithm is constant time (Ω(1)). It doesn't have to reach this lower bound, but any algorithm that runs at all performs at least one operation.

invalid_id
  • Big O, Theta and Omega have nothing to do with best/worst/average case. See [this thread](http://stackoverflow.com/q/10376740/572670). Big Theta/O/Omega can be applied to any analysis - these are just sets of functions. – amit Jan 23 '14 at 08:41
  • I agree that the notation is just a mathematical way of showing bounds on functions. But when talking about runtime analysis, `Ω` is used as a lower bound on the running time, hence the best case, and `O` is used for the upper bound, hence the worst case. When the lower bound == upper bound we can use Theta. So yes, I made a mistake and fixed it. – invalid_id Jan 23 '14 at 08:48