
I need help understanding/doing Big O Notation. I understand the purpose of it, I just don't know how to "determine the complexity given a piece of code".

Determine the Big O notation for each of the following

a.

n=6;
cout<<n<<endl;

b.

n=16;
for (i=0; i<n; i++)
    cout<<i<<endl;

c.

i=6;
n=23;
while (i<n) {
    cout<<i-6<<endl;
    i++;
}

d.

int a[ ] = {1, 3, 5, 7, 9, 11, 13, 15, 17, 19};
n=10;
for (i=0; i<n; i++)
    a[i]=a[i]*2;
for (i=9; i>=0; i--)
    cout<<a[i]<<endl;

e.

sum=0;
n=6;
k=pow(2,n);
for (i=0;i<k;i++)
    sum=sum+k;
Nikolay Kostov
user1074051
    I won't vote to close because I think this can salvaged to a decent question, even if it is covered ground. But, you need to post your attempts at these questions first. Tell us what you think the answers are, and why you think they're that way. – corsiKa Nov 30 '11 at 19:12
  • When all data are hardcoded it always be IMO O(1). If there is some variable then O can be different. Also read Cormen's 'Introduction to algorithms', one of the first sections are about O, Omega and Theta notation. – Hauleth Nov 30 '11 at 19:13
  • i really have no clue where to start, I didn't even see my professor go over this material and have the read the book and it really makes no sense to me. I did some searching and see these complex formulas but havent seen any questions that look like the ones my professor gave me – user1074051 Nov 30 '11 at 19:14
  • Is question `d)` two nested for-loops? Because in C++ with no `{}` only the next line is taken to be part of the loop...that would change the answer. Please restructure your code for better answers :) – mevatron Nov 30 '11 at 19:35
  • @user1074051 - Your test is tomorrow and you have "no clue?" Good luck =) – 5StringRyan Nov 30 '11 at 20:47
  • Technically, because all of these pieces of code specify the values of their variables, they all run in constant time and are thus O(1). The answers below, however, are truer to the intent of the question. – Keith Irwin Jan 06 '14 at 08:42

5 Answers


Big O indicates the order of the complexity of your algorithm.

Basic things:

  • Complexity is measured with respect to the input size
  • You choose a unit operation (usually an assignment or a comparison)
  • You count how many times this operation is called
  • Constant terms and constant factors are usually ignored, so if the number of operations is 3*n^3 + 12 it simplifies to n^3, written O(n^3)

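For instance, a minimal sketch of this counting procedure (the helper name `count_ops` is made up for illustration): instrument the loop from b.) with a counter, and the count comes out equal to n.

```cpp
#include <cstddef>

// Hypothetical helper: count how many times the unit operation
// (the loop body) executes for an input of size n.
std::size_t count_ops(std::size_t n) {
    std::size_t count = 0;
    for (std::size_t i = 0; i < n; i++)
        count++;          // one unit operation per iteration -> n total
    return count;         // grows linearly with n, i.e. O(n)
}
```

Doubling n doubles the count, which is exactly what "linear", O(n), means.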
a.) Runs just once, no loop; the complexity is trivially O(1)

b.) The loop body is called n times: O(n)

c.) Here we choose to analyze in terms of n (since it's usually the incrementing variable in an algorithm). The number of calls is n - 6, so this is O(n).

d.) Here n = 10 is the size of the array. The first loop goes from 0 up to n - 1, and the second goes from n - 1 back down to 0: that's n + n = 2n operations, technically O(2*n), which simplifies to O(n). Both are called linear time; what differs is the slope of the line, which Big O doesn't care about.

e.) The loop goes from 0 to pow(2, n) - 1, i.e. it runs 2^n times, summarized as O(2^n)
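As a sketch (the counter helper `count_ops_e` is made up for illustration), instrumenting the loop from e.) shows the count doubling each time n goes up by one:

```cpp
#include <cstddef>

// Count the iterations of the loop in e.) for a given n.
// k = 2^n, so the count doubles whenever n increases by one: O(2^n).
std::size_t count_ops_e(unsigned n) {
    std::size_t k = std::size_t(1) << n;   // integer pow(2, n)
    std::size_t count = 0;
    for (std::size_t i = 0; i < k; i++)
        count++;                           // one unit operation per pass
    return count;
}
```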

Eric Leschinski
AsTeR

Assuming you don't count the cout as part of your Big-O measurement.

a) O(1): the integer assignment is performed in constant time.
b) O(n): the loop takes n operations.
c) O(n - c) = O(n): constants disappear in Big-O.
d) O(2*n) = O(n): two linear-time loops in sequence are still linear time.
e) The loop runs k = pow(2, n) = 2^n times, so the operation count grows as O(2^n); if n were held constant, k = 2^6 = 64 would itself be a constant and the loop would run in constant time.
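To make d) concrete, a minimal sketch (the helper name `count_ops_d` is illustrative) counting the unit operations of two back-to-back linear loops:

```cpp
#include <cstddef>

// Two sequential loops over n elements: n + n = 2n unit operations.
// The constant factor 2 is dropped, so the whole thing is O(n).
std::size_t count_ops_d(std::size_t n) {
    std::size_t count = 0;
    for (std::size_t i = 0; i < n; i++) count++;   // first pass (doubling)
    for (std::size_t i = 0; i < n; i++) count++;   // second pass (printing)
    return count;
}
```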

mevatron

These examples are fairly simple. First, determine the main (simple) operation in the code and try to express the number of invocations of that operation as a function of the input.

To be less abstract:

a.)

This code always runs in constant time. That time depends on the computer, I/O latency, etc., but it is almost entirely independent of the value of n.

b.)

This time a piece of code inside a loop is executed several times. If n is two times bigger, what can you say about the number of iterations?

c.)

Again some code inside a loop. But this time the number of iterations is less than n. Still, if n is sufficiently big, do you see the similarity to b.)?

d.)

This code is interesting. The operation inside the first loop is more sophisticated, but again it takes a more-or-less constant amount of time. So how many times is it executed in relation to n? Once again, compare with b.)

The second loop is there only to trick you. For small n it might actually take more time than the first one. However, Big O notation always considers large values of n.

e.)

The last piece of code is actually pretty simple. The number of simple operations inside the loop is equal to 2^n. Add 1 to n and you'll get twice as many operations.

Tomasz Nurkiewicz

a

n=6;
cout<<n<<endl;

Constant time, O(1). This means as n increases from 1 to infinity, the amount of time needed to execute this statement does not increase. Each time you increment n, the amount of time needed does not increase.

b

n=16;
for (i=0; i<n; i++)
    cout<<i<<endl;

Linear Time, O(n). This means that as n increases from 1 to infinity, the amount of time needed to execute this statement increases linearly. Each time you increment n, the amount of additional time needed from the previous remains constant.

c

i=6;
n=23;
while (i<n) {
    cout<<i-6<<endl;
    i++;
}

Linear time, O(n), same as example b above.

d

int a[ ] = {1, 3, 5, 7, 9, 11, 13, 15, 17, 19};
n=10;
for (i=0; i<n; i++)
    a[i]=a[i]*2;
for (i=9; i>=0; i--)
    cout<<a[i]<<endl;

Linear time, O(n). As n increases from 1 to infinity, the amount of time needed to execute these statements increases linearly. The line is twice as steep as in example c, but Big O notation does not concern itself with how steep the line is, only with how the time requirement grows. The two loops together still require a linearly growing amount of time as n increases.

e

sum=0;
n=6;
k=pow(2,n);
for (i=0;i<k;i++)
    sum=sum+k;

Create a graph of how many times sum=sum+k is executed given the value n:

n     number_of_times_in_loop
1     2^1 = 2
2     2^2 = 4
3     2^3 = 8
4     2^4 = 16
5     2^5 = 32
6     2^6 = 64

As n goes from 1 to infinity, notice how the number of times we are in the loop exponentially increases. 2->4->8->16->32->64. What would happen if I plugged in n of 150? The number of times we would be in the loop becomes astronomical.

This is exponential time: O(2^n) denotes an algorithm whose growth doubles with each additional element in the input data set. Plug in a large n at your own peril; you will be waiting hours or years for the calculation to complete for a handful of extra input items.

Why do we care?

As computer scientists, we are interested in properly understanding BigO notation because we want to be able to say things like this with authority and conviction:

"Jim's algorithm for calculating the distance between planets takes exponential time. If we want to do 20 objects it takes too much time, his code is crap because I can make one in linear time."

And better yet, if they don't like what they hear, you can prove it with Math.

Eric Leschinski

To understand the full mathematical definition I recommend Wikipedia. For simple purposes, big-oh is an upper bound on an algorithm: given a routine, how many times does it iterate before finishing for an input of length n? We call this upper bound O(n), or big oh of n.

In code, accessing an element of a simple array in C++ is O(1). It is one operation regardless of how large the array is.

A linear iteration through an array in a for loop is O(n)

Nested for loops are O(n^2), or more generally O(n^k) for k nested loops
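For example, a small sketch (the helper name is made up) counting how often the innermost body of two nested loops runs:

```cpp
#include <cstddef>

// Two nested loops, each over n elements: the inner body runs
// n * n times, so the count grows as O(n^2).
std::size_t count_nested(std::size_t n) {
    std::size_t count = 0;
    for (std::size_t i = 0; i < n; i++)
        for (std::size_t j = 0; j < n; j++)
            count++;       // executed n * n times in total
    return count;
}
```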

Recursion with divide and conquer (heaps, binary trees, etc.) is O(lg n) or O(n lg n), depending on the operation.
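As an illustrative sketch of the O(lg n) case, binary search over a sorted array halves the search range each iteration, so only about lg(n) steps are needed:

```cpp
#include <cstddef>

// Binary search over a sorted array: the search range [lo, hi)
// halves every iteration, so at most about lg(n) + 1 iterations
// run before the loop ends -- O(lg n).
int binary_search(const int *a, std::size_t n, int target) {
    std::size_t lo = 0, hi = n;
    while (lo < hi) {
        std::size_t mid = lo + (hi - lo) / 2;  // avoids overflow of lo + hi
        if (a[mid] == target) return (int)mid;
        if (a[mid] < target) lo = mid + 1;     // discard lower half
        else                 hi = mid;         // discard upper half
    }
    return -1;                                  // not found
}
```

Compare with the linear scan above: doubling n adds only one extra iteration here, instead of doubling the work.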

lukecampbell