
From *how to calculate Bubble sort Time Complexity* on Stack Overflow I came to know that the worst-case complexity of bubble sort is O(n²).

But my confusion is about the way it has been derived:

n + (n - 1) + (n - 2) + ... + 1 = n(n + 1)/2 = O(n²)

Now the equation (n(n + 1))/2 = O(n²) is contradictory.

If I take n = 10, then n*(n + 1)/2 = 55, so how can it be equal to n², which comes out to 100? It is actually close to half of that, so we cannot say it is ~.
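For concreteness, here is a quick Python sketch (just an illustration; the choice of Python is arbitrary, since the question has no code) comparing the two expressions for a few values of n:

```python
# Compare n*(n+1)/2 with n^2 for a few values of n (illustration only):
# the ratio stays close to 1/2 rather than approaching 1.
for n in (10, 100, 1_000, 1_000_000):
    f = n * (n + 1) // 2
    print(f"n={n:>9}  n(n+1)/2={f:>14}  n^2={n * n:>14}  ratio={f / (n * n):.6f}")
```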

Kindly clear my doubt.

T-Bag
    You should learn the definition of **O** – Eran Jul 12 '16 at 13:53
  • From the question you're asking I get the sense that you're not familiar with big-O notation. Do a quick search on this site for more information and see if that helps you. I can confirm that this math is indeed correct. :-) – templatetypedef Jul 12 '16 at 13:54
  • "(n*(n + 1))/2 = O(n^2) is contradictory." It's wrong, not contradictory: `(n*(n + 1))/2` is *in*, not is equal to, `O(n^2)`. – Andy Turner Jul 12 '16 at 13:54

6 Answers


From Wikipedia:

f(x) = O(g(x)) as x goes to infinity if and only if there is a positive constant M such that for all sufficiently large values of x, the absolute value of f(x) is at most M multiplied by the absolute value of g(x).

So in your example there is such a constant: if we take M = 3, then for all n > 0 the inequality n*(n + 1)/2 < 3*n^2 holds.
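A brute-force way to convince yourself of this is a small Python check (my own sketch, not part of the original answer; the constant M = 3 is the one used above, and the range of n is arbitrary):

```python
# Finite check of the witness constant M = 3 -- not a proof, but it
# matches the algebra: n*(n+1)/2 = n^2/2 + n/2 <= n^2 < 3*n^2 for n >= 1.
M = 3
assert all(n * (n + 1) / 2 < M * n ** 2 for n in range(1, 100_001))
print("n*(n+1)/2 < 3*n^2 holds for every n in 1..100000")
```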

Moreover, this definition also implies that O(n^2) = O(n^2/180) = O(n^2 + n), and so on.

Ohad Eytan

Now the equation (n*(n + 1))/2 = O(n^2) is contradictory.

No it isn't.

Because it isn't really an equation.

In fact, O(n^2) is really shorthand for an infinite set of functions f(n), each of which has the property:

f(n) <= C * n^2 for all sufficiently large n ... for some constant C.

(There are more precise ways of stating this ...)

Intuitively, saying that f(n) is a member of the set O(n^2) tells us that f(n) grows (at most) in proportion to n^2 as n gets really big.


It is easy to prove that f(n) = n*(n + 1)/2 is a member of the set O(n^2).

Informally, as n gets really big, the n^2/2 term of this f(n) dominates and the n/2 term disappears into insignificance.
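To make the domination point concrete, here is a small Python sketch (mine, not part of the original answer) that splits f(n) = n*(n+1)/2 into its n^2/2 and n/2 parts and prints each part's share of the total:

```python
# As n grows, the n^2/2 part accounts for essentially all of f(n),
# while the n/2 part's share shrinks towards zero.
for n in (10, 1_000, 1_000_000):
    total = n * (n + 1) / 2
    print(f"n={n:>9}  n^2/2 share={(n ** 2 / 2) / total:.6f}  "
          f"n/2 share={(n / 2) / total:.6f}")
```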

Stephen C

The way time complexity works is that you want to describe how the function behaves at very large values of N. The expression is not meant to be exact; it is an approximation for large values. For example, if N = 1,000,000 and your time complexity is O(n+1), are 1,000,000 and 1,000,001 really any different?

The idea is that you keep whichever of the added terms grows the fastest asymptotically. So in n*(n+1)/2 = n^2/2 + n/2, you would keep the n^2 part, since that grows the fastest.
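A tiny numeric take on that example (my own sketch; N = 1,000,000 is just the value used above):

```python
# At large N the extra "+1" is a vanishing fraction of the total,
# which is why O(n + 1) describes the same growth as O(n).
N = 1_000_000
print((N + 1 - N) / N)   # 1e-06, i.e. a 0.0001% relative difference
```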

MathBunny

O-notation is meant to show how the time required grows with the number of inputs, as the number of inputs gets large. For example, let's suppose that a bit of code is O(n). If we were to double the input, we'd expect the run time to (roughly) double. If we were to triple it, we'd expect the runtime to triple as well. Note that this would be true regardless of any constant factor that might exist in a hypothetical exact formula for the runtime of the code.

Something similar can be said of O(n^2): doubling the input will cause the runtime to roughly quadruple, and so on.

So :

O(n^2) == O(1/2*n^2) == O(2*n^2) == O(1000000000*n^2)

Here is a nice tutorial.
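As a rough illustration of the doubling argument (my own sketch, not from this answer; it counts the worst-case comparison count n*(n-1)/2 rather than measuring wall-clock time):

```python
# Doubling the input size roughly quadruples the work for an O(n^2)
# algorithm such as bubble sort (worst case ~ n*(n-1)/2 comparisons).
def worst_case_comparisons(n):
    return n * (n - 1) // 2

for n in (1_000, 2_000, 4_000):
    ratio = worst_case_comparisons(2 * n) / worst_case_comparisons(n)
    print(f"n={n:>5}  comparisons={worst_case_comparisons(n):>9}  "
          f"work(2n)/work(n)={ratio:.3f}")
```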

D Hydar

To be precise, O(...) is a set of functions, not a number. People are often a bit lazy and write x = O(y) instead of x ∈ O(y).

You can find the exact definition of the set O(y) in the column formal definition from this table: https://en.wikipedia.org/wiki/Big_O_notation#Family_of_Bachmann.E2.80.93Landau_notations

What does this mean informally? O(f(x)) contains the functions that grow no faster than f(x), up to a constant factor.
For example, if you have g(n) = n^2 + 5n - 7000, then it is in O(n^2), since n^2 is the dominant part of the function (compare with the exact definitions).

Also, constant factors can be dropped. So g(n) = 1000n^2 is also in O(n^2).

Thus, O(...) is only an indicator of which variables something depends on, and how strongly. For example, there exists an input size (it may be very big) beyond which the function n^2 is bigger than 100000000 * n.

Because of that, an algorithm with a time complexity of O(n^2) is in general not as good as one in O(n). But, and this is very important, in practice the O(n^2) algorithm may still be preferable: the hidden constants of the O(n) algorithm may be so big that the O(n^2) algorithm is faster for the input sizes that occur in practice. There is always a bound on the input size (it can be very large) beyond which the O(n) algorithm wins, but such inputs may never occur in practice.
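To illustrate the hidden-constants point, here is a made-up Python example (the cost functions `cost_a` and `cost_b` and the constant 100000000 are hypothetical, chosen only to match the numbers mentioned above):

```python
# cost_a models an O(n^2) algorithm with a tiny constant factor,
# cost_b an O(n) algorithm with a huge one; the linear algorithm
# only wins beyond a very large crossover input size.
def cost_a(n):
    return n * n                  # quadratic, small constant

def cost_b(n):
    return 100_000_000 * n        # linear, big hidden constant

for n in (10_000, 1_000_000, 200_000_000):
    winner = "O(n) wins" if cost_b(n) < cost_a(n) else "O(n^2) wins"
    print(f"n={n:>11}: {winner}")
```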

Zabuzard

The complexity of a comparison-based sorting algorithm depends on the number of comparisons it makes. In bubble sort, there are N-1 passes in total. In the first pass, N-1 comparisons are made to place the highest element in its correct position. Then in pass 2, the second highest element is placed in its position, and so on. Therefore, to compute the complexity we have to count the comparisons made by the algorithm.

Using basic discrete math:

f(n) = (n-1) + (n-2) + (n-3) + ... + 3 + 2 + 1

f(n) = n(n-1)/2

f(n) = (n^2 - n)/2   (constant factors and lower-order terms are ignored when deriving complexities)

f(n) = O(n^2)

Therefore the complexity of bubble sort is O(n^2). This means the time required to execute bubble sort grows proportionally to n^2, where n is the number of elements in the array.
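As a quick sanity check of this derivation (my own sketch, not part of the answer), the following Python snippet counts the comparisons a plain bubble sort makes on a reverse-sorted, i.e. worst-case, array and compares the count with n(n-1)/2:

```python
def bubble_sort_comparisons(arr):
    """Sort a copy of arr with bubble sort and return the number of comparisons."""
    a = list(arr)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):             # N-1 passes in total
        for j in range(n - 1 - i):     # pass i makes n-1-i comparisons
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons

for n in (10, 100, 1_000):
    measured = bubble_sort_comparisons(range(n, 0, -1))   # reverse-sorted input
    print(n, measured, n * (n - 1) // 2)                  # the two counts agree
```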

Jatin Chauhan