2

I have been doing some self-study on Big-O. I understand how to give examples of the following notations to algorithms:

O(N):

for(int i = 0; i < n; i++)
    sum++;

O(N^2):

for(int i = 0; i < n; i++)
    for( int j = 0; j < n; j++)
        sum++;

O(N^3):

for(int i = 0; i < n; i++)
    for( int j = 0; j < n * n; j++)
        sum++;

I have come across these notations which I don't quite comprehend. How do I give examples of these in terms of algorithms?

Maybe I should phrase it this way: write an algorithm which takes running time in proportion to:

  1. O((n^3)/4)
  2. log n^3
  3. O((log^2)n)+O(n)
  4. 4^n
  5. n^3/2
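For concreteness, here is a hedged sketch (my own illustration, not part of the original question) of loop shapes whose step counts grow like two of the listed expressions. The function names are made up; note that Big-O discards constant factors, so n^3/4 lands in the same class as n^3:

```cpp
#include <cassert>

// Roughly n^3/4 basic steps: (n/2) * (n/2) * n = n^3/4, which is still
// O(n^3) because Big-O discards the constant factor 1/4.
long long stepsCubedOverFour(int n) {
    long long sum = 0;
    for (int i = 0; i < n / 2; i++)
        for (int j = 0; j < n / 2; j++)
            for (int k = 0; k < n; k++)
                sum++;
    return sum;
}

// Roughly n^(3/2) basic steps: n iterations outside, about sqrt(n) inside.
long long stepsThreeHalves(int n) {
    long long sum = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j * j < n; j++)  // runs while j < sqrt(n)
            sum++;
    return sum;
}
```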
user471646
  • Did you search SO? See http://stackoverflow.com/questions/107165/big-o-for-eight-year-olds and http://stackoverflow.com/questions/487258/plain-english-explanation-of-big-o and http://stackoverflow.com/search?q=Big+O – KMån Oct 25 '10 at 08:23
  • 1
    The question is usually how you find the Big O for a specific algorithm. – mohdajami Oct 25 '10 at 08:23
  • These are **examples** of O() algorithms. There's an infinite number of algorithms fitting any big O. Check @KMan's links. – Pontus Gagge Oct 25 '10 at 08:32
  • BTW: `O(log^2n)+O(n)` is not meaningful. O is a notation, not a function, so you cannot add it. Also, what is "log^2n" supposed to mean? – sleske Oct 25 '10 at 08:36
  • Sorry, I edited some typos. I was looking at some examples at dreamincode.net/forums/topic/125427-determining-big-o-notation I am able to study some examples derived from notations, i.e. logN. I am wondering if I could also derive examples from my given notations? I am also looking at his code for logN: for(int i = 0; i < n; i *= 2) { cout << i << endl; } Is this correct? Or is it actually O(N)? – user471646 Oct 25 '10 at 09:08
  • Your phrasing is misleading. "How do I express these" should probably be "Can you give me examples of". – beldaz Oct 25 '10 at 09:15
  • To be really annoying the following function is `O(n^3/4)`, `O(logn^3)`, `O((logn)^2 + n)`, `O(4^n)` and `O(n^(3/2))`. Here goes: `function foo(int [] list) { return 42; }`. It's `O(1)` and thus also upper bounded by, for example, `O(n)`. @user471646 - I would practice the other way around, write/find an algorithm then try to find the best upper bound on its complexity. – Ishtar Oct 25 '10 at 10:19
  • I don't quite understand your function. Maybe someone could at least show me O(n^3/4): how would it look as a working algorithm example? – user471646 Oct 25 '10 at 11:57
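One note on the logN loop quoted in the comments: as written, `for(int i = 0; i < n; i *= 2)` never terminates for n > 1, because 0 * 2 is still 0. Starting i at 1 gives the intended O(log N) behaviour; a minimal sketch (counting iterations instead of printing):

```cpp
#include <cassert>

// Doubling i each pass visits 1, 2, 4, ... up to n, so the body executes
// about log2(n) times -- the classic O(log N) loop shape.
int countDoublings(int n) {
    int count = 0;
    for (int i = 1; i < n; i *= 2)  // i = 0 would loop forever: 0 * 2 == 0
        count++;
    return count;
}
```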

2 Answers

9

I fear you are misunderstanding "Big-O" notation.

A notation is not "expressed" as an algorithm. Rather, the Big-O notation describes a property of an algorithm.

So it's not "O(N) can be expressed as XXX", but rather "algorithm XXX has complexity of O(N)".

That said, it is quite reasonable to ask for examples of algorithms with a certain complexity; you already list some. To address your questions:

O(4^n) is the same as O(e^n), often written as O(exp(n)) (try to understand why it is the same). O(4^n) belongs to the class of algorithms with "exponential complexity" (EXPTIME). Many important problems in math/CS have exponential complexity (or nearly exponential complexity).

An algorithm with exponential complexity is, for example, a naive solution to the discrete logarithm problem.
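A minimal illustration of that naive approach (my own sketch, not part of the original answer): trying every exponent x until g^x ≡ h (mod p) takes up to p steps, which is exponential in the bit-length of p:

```cpp
#include <cassert>

// Naive discrete logarithm: find x such that (g^x) mod p == h.
// Scanning every exponent costs O(p) multiplications, i.e. exponential
// in the number of bits needed to write p down.
long long naiveDiscreteLog(long long g, long long h, long long p) {
    long long value = 1 % p;
    for (long long x = 0; x < p; x++) {
        if (value == h) return x;
        value = (value * g) % p;
    }
    return -1;  // no solution
}
```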

For the other complexities I cannot give an example, but you will probably find one by googling a bit.

sleske
  • I don't buy your O(a^n) = O(b^n) with a,b>1. While it is true that the base is irrelevant when dealing with logarithms, this does not hold for exponential functions. Otherwise the entire EXPTIME class would be a set of problems with O(2^n) (or more conveniently O((1+epsilon)^n)) algorithms. – bitmask Oct 25 '10 at 09:01
  • 1
    @bitmask: Actually, it is. Wikipedia e.g. defines EXPTIME as "the set of all decision problems solvable by a deterministic Turing machine in O(2^p(n)) time" (with p(n) being a polynomial in n). – sleske Oct 25 '10 at 10:05
  • BTW, my original statement "O(4^n) is the same as O(e^n)" is of course false. What I had in mind (but very dimly) was that they are both in EXPTIME. Thanks for reminding me :-). – sleske Oct 25 '10 at 10:13
  • Yes, that is something different. O(2^p(n)) CONTAINS all O(a^n), as you can hide the different base (a) inside p. For instance, if p(n)=2n, then you have 2^(p(n)) = 4^n. But that does not imply that O(2^n) = O(4^n). – bitmask Oct 25 '10 at 10:34
-1

The algorithms you have given are, strictly speaking, not Big-O but Theta. Big-O is an asymptotic upper bound, meaning that on some worst case input the running time will be the one given but isn't for all inputs, whereas Theta is a tight bound, meaning that the running time will always be that. Consider for instance a sorting algorithm that is given an already sorted list as input, or a search algorithm where the thing to be found is the first element in a list (how does binary search compare to linear search in this case?).
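To make the best-case/worst-case distinction concrete, a linear-search sketch (my own illustration): the worst case examines all n elements, so the algorithm is O(n), but a hit at index 0 finishes after a single comparison, so its running time is not Theta(n) across all inputs:

```cpp
#include <cassert>

// Linear search: O(n) in the worst case (target absent or last), but the
// best case (target first) takes one comparison, so it is not Theta(n).
int linearSearch(const int* a, int n, int target) {
    for (int i = 0; i < n; i++)
        if (a[i] == target)
            return i;
    return -1;  // not found
}
```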

Matti Lyra
  • If we are going to be nit-picky: any algorithm that is `Theta(f(N))` is *also* `O(f(N))`. So the OP is actually correct in saying they are Big-O. – Stephen C Oct 25 '10 at 10:01
  • ??? Those are all both Big-O **and** Omega, implying Theta. "meaning that on some worst case input". Theta is just being more specific; the lower bound has been found/proven as well. It is not related to worst case. "running time will always be that" is strictly speaking not correct; better would be "the rate of growth is proportional to Theta". – Ishtar Oct 25 '10 at 10:08