
How is Big O calculated for sorting algorithms? I have written a program to sort a deck of cards (n = 52) using Selection sort, Merge sort, and Insertion sort. For each sorting algorithm, I have to calculate the computation cost. How do I do it?

To be honest, this is part of my homework, but I am only asking for some help. I am new to Big O notation and went through a couple of websites, but I couldn't figure out how it applies to sorting algorithms.

Can someone at least give me some hints?

I just referred to the Wikipedia articles on Selection sort, Insertion sort, and Merge sort.

Insertion sort: best case O(n); average and worst case O(n^2)

Selection sort: O(n^2) in all three cases

Merge sort: O(n log n) in all three cases

Is that true?
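
For reference, this is roughly what my Selection sort looks like (a simplified sketch, not my exact code; the deck is just the integers 0..51):

    # Rough sketch: Selection sort over a 52-card deck represented as 0..51.
    import random

    def selection_sort(deck):
        n = len(deck)
        for i in range(n):                 # outer loop: n passes
            smallest = i
            for j in range(i + 1, n):      # inner loop: scan the unsorted rest
                if deck[j] < deck[smallest]:
                    smallest = j
            deck[i], deck[smallest] = deck[smallest], deck[i]
        return deck

    deck = list(range(52))
    random.shuffle(deck)
    print(selection_sort(deck))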

Komal Rathi

1 Answer


What you're looking for is called complexity, in your case probably time complexity (runtime). "Big O" denotes an asymptotic upper bound, "Big Omega" a lower bound, and "Big Theta" a tight bound (both at once); "little o" is a strictly weaker form of upper bound. These notations describe how the number of steps a program takes grows with the input size, ignoring constant factors and lower-order terms.
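
For reference, the usual formal definitions (standard textbook wording, not specific to your assignment):

    f(n) = O(g(n))     means: there are constants c > 0 and n0 such that f(n) <= c*g(n) for all n >= n0
    f(n) = Omega(g(n)) means: there are constants c > 0 and n0 such that f(n) >= c*g(n) for all n >= n0
    f(n) = Theta(g(n)) means: f(n) is both O(g(n)) and Omega(g(n))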

Example: if your sorting algorithm iterates once over every item of a list of size n (e.g. with a for loop), and on every step loops again over every other item, doing l constant steps each time, then you need about l*n*n steps, plus some k steps of preparation etc. That gives a complexity of O(l*n*n + k), which is equal to O(n*n) = O(n^2), because l and k are constants.
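
Here is that example made concrete (a hypothetical sketch, not your actual sort), with an explicit step counter so you can watch the count grow quadratically:

    # Hypothetical sketch of the example above: an outer loop over n items,
    # an inner loop over n items, l constant steps inside, plus k setup steps.
    def count_steps(n, l=3, k=10):
        steps = k                      # constant preparation work
        for i in range(n):             # outer loop: n iterations
            for j in range(n):         # inner loop: n iterations
                steps += l             # l constant steps of work per pair
        return steps

    for n in (10, 100, 1000):
        print(n, count_steps(n))       # grows like 3*n^2 + 10, i.e. O(n^2)

Doubling n roughly quadruples the count; the constants 3 and 10 never change that growth rate, which is why they disappear inside the O(...).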

More detailed information: http://en.wikipedia.org/wiki/Time_complexity
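
To tie this back to your 52-card deck: one way to see the cost is to count the comparisons your sorts actually make and compare the numbers with n^2 = 2704 and n*log2(n) ≈ 296. A rough sketch of that kind of instrumentation (my own counter placement, not your program):

    # Hypothetical instrumentation: insertion sort that counts its comparisons.
    import random

    def insertion_sort_comparisons(a):
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1               # one comparison of key against a[j]
                if a[j] <= key:
                    break
                a[j + 1] = a[j]                # shift the larger element right
                j -= 1
            a[j + 1] = key
        return comparisons

    deck = list(range(52))
    random.shuffle(deck)
    print(insertion_sort_comparisons(deck))    # on the order of n^2/4 ≈ 676 for a shuffled deck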

Cedric Reichenbach