4

I know what O(n) notation is, and I also understand how to arrive at complexities like O(n), O(n^2), ....

  • O(n) means I have to pass through the sequence once
  • O(n^2) means I have two nested loops traversing the sequence

But how do I get O(log N)?

P.S.: I know the Collections API and that some of its classes have O(n log n) traversal time. I need a simple explanation.

Joachim Sauer
user590444

3 Answers

12

lg N comes about in divide-and-conquer algorithms, where you iteratively or recursively skip half of the data in a collection of N items. Binary search is the quintessential example. The insert, lookup and delete operations on balanced binary search trees are also O(lg N).

Iteratively discarding half the elements is, in an intuitive sense, the inverse of iteratively doubling the number of elements. Doubling would produce O(2^N) elements in N iterations. Note that the binary logarithm of N is the inverse of raising two to the power N.
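To make that inverse relationship concrete, here is a tiny sketch (the class and method names are mine) that counts how many times you can halve n before reaching 1 — for n = 2^k the count is exactly k, the binary logarithm:

```java
// Counts how often n can be halved before it reaches 1.
// For n = 2^k this is exactly k, i.e. the binary logarithm of n.
class HalvingDemo {
    static int halvingSteps(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;       // discard half of the remaining elements
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(halvingSteps(8));    // 3, since 2^3 = 8
        System.out.println(halvingSteps(1024)); // 10, since 2^10 = 1024
    }
}
```

An algorithm that does constant work per halving step, like binary search, therefore performs O(lg N) work in total.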

Sorting an array can be done in O(N lg N) by algorithms such as mergesort, but also conceptually simpler: iterate over the array and put the elements in a binary search tree. Each insert takes O(lg N) and there are N of them, so the algorithm runs in O(N lg N).

(The sort-by-BST even has worst-case O(N lg N) complexity if the tree is balanced.)
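A rough sketch of the sort-by-BST idea, using `java.util.TreeMap` (a red-black tree, so it stays balanced) and counting duplicates so equal elements survive; the class and method names are mine:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

class TreeSortDemo {
    // N inserts at O(lg N) each, then an in-order walk: O(N lg N) overall.
    static List<Integer> treeSort(List<Integer> input) {
        TreeMap<Integer, Integer> counts = new TreeMap<>();
        for (int x : input) {
            counts.merge(x, 1, Integer::sum);   // N inserts, O(lg N) each
        }
        List<Integer> sorted = new ArrayList<>();
        counts.forEach((value, count) -> {      // in-order traversal, O(N)
            for (int i = 0; i < count; i++) sorted.add(value);
        });
        return sorted;
    }

    public static void main(String[] args) {
        System.out.println(treeSort(List.of(5, 1, 4, 1, 3))); // [1, 1, 3, 4, 5]
    }
}
```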

Fred Foo
  • To be exact, "comparison sorts" can only be done in O(n log n); if you throw out that limitation, you can sort an array in O(n). – Voo Mar 20 '11 at 19:17
3

The best example of N*log(N), in my opinion, is a comparison-based sort:

Let's take merge sort as an example.

The algorithm takes an array of elements and splits it in half recursively until it gets down to arrays of length 1 or 2, then merges the pieces back together. Each merge costs O(n), because merging two already-sorted arrays is easy.

OK, that was a nutshell description; now let's see what it looks like:

  [ 1 2 3 4 5 6 7 8 ]  
           |  
           /\  
  [1 2 3 4]  [ 5 6 7 8 ]
      |             |
      /\            /\
[1 2 ] [ 3 4]   [5 6] [7 8]

What I hope you can see here is that the depth of the recursion is log(n) (because the tree branches in two at each level, so it's log(n) with base 2).
But at each level of the tree there are O(n) operations (because we are actively re-arranging all n objects).

So there's N*log(N) complexity here.
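The recursion in the picture can be sketched as a plain top-down merge sort (the class and method names are mine):

```java
import java.util.Arrays;

class MergeSortDemo {
    static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;             // base case: already sorted
        int mid = a.length / 2;
        int[] left  = mergeSort(Arrays.copyOfRange(a, 0, mid));    // lg(n) levels...
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);               // ...with O(n) merge work per level
    }

    // Merging two sorted arrays is easy: repeatedly take the smaller head.
    static int[] merge(int[] left, int[] right) {
        int[] out = new int[left.length + right.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length)  out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }

    public static void main(String[] args) {
        int[] sorted = mergeSort(new int[]{5, 2, 8, 1, 3, 4, 7, 6});
        System.out.println(Arrays.toString(sorted)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```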

Most algorithms that have log(N) complexity get it from a tree-shaped structure of the work.
There are a few that are log(n) only probabilistically (meaning that, after a long math computation, the expected cost turns out to be something like log(n)).

Yochai Timmer
2

A typical algorithm with O(log n) complexity is binary search in a sorted array. Instead of stepping through the elements one by one (which is O(n)), you iteratively split your array in half and only look in the half where you know your element can be.
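For real code you would use `java.util.Arrays.binarySearch`, but a minimal sketch of that halving looks like this (the class and method names are mine; this version returns the index, or -1 if the element is absent):

```java
class BinarySearchDemo {
    // Each iteration halves the search window, so at most ~log2(n) iterations run.
    static int binarySearch(int[] sorted, int target) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;        // avoids overflow of (lo + hi)
            if (sorted[mid] == target) return mid;
            else if (sorted[mid] < target) lo = mid + 1; // look only in the right half
            else hi = mid - 1;                           // look only in the left half
        }
        return -1;                               // not found
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(a, 7));  // 3
        System.out.println(binarySearch(a, 4));  // -1
    }
}
```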

Albin Sunnanbo