
This is NOT homework, but practice problems from class. There is no solution key that I know of, so I would like to check whether I did these right. Also, how would I find the time complexity? My solutions are at the bottom of this post.

Determine the Big Oh complexity of the following program fragments

1)

void sum1(int n){
    sum1 = 0;
    for(i=1; i<=n; i++)
        for (j=1; j<=n; j++)
            sum1++;
}

2)

void sum2(int n){
    sum2 = 0;
    for(i=1; i<=n; i++)
        for (j=1; j<=i; j++)
            sum2++;
}

3)

float useless(a){
    n = a.length;
    if (n==1){
        return a[0];
    }
    // let a1, a2 be arrays of size n/2
    for (i=0; i <= (n/2)-1; i++){
        a1[i] = a[i];
        a2[i] = a[n/2 +i];
    }
    for (i=0; i<=(n/2)-1; i++){
        for (j = i+1; j<=(n/2) -1; j++){
            if(a1[i] == a2[j])
                a2[j] = 0;
        }
    }
    b1 = useless(a1);
    b2 = useless(a2);
    return max(b1,b2);
}

4)

void sum3(int n) {
    sum = 0;
    for(i=0; i<sqrt(n)/2; i++)
        sum++;
    for(j=0; j<sqrt(n)/4; j++)
        sum++;
    for(k=0; k<8+j; k++)
        sum++;
}

5)

void sum4(int n){
    sum = 0;
    for(i=0; i<sqrt(n)/2; i++)
        for(j=1; j<8+i; j++)
            for(k=j; k<8+j; k++)
                sum++;
}

Below are my solutions; a '?' means I'm not sure what to put there.

1)

void sum1(int n){
sum1 = 0;             -->  1
for(i=1; i<=n; i++)   -->  n
for (j=1; j<=n; j++)  -->  n^(2)
sum1++;               -->  ?
}

Answer: O(n^(2))

2)

void sum2(int n){   
sum2 = 0;             --> 1
for(i=1; i<=n; i++)   --> n
for (j=1; j<=i; j++)  --> n^(2)
sum2++;               --> ?
}

Answer: O(n^(2))

3)

float useless(a){
n = a.length;               --> ?
if (n==1){
    return a[0];            --> ?
}
    // let a1, a2 be arrays of size n/2
for (i=0; i <= (n/2)-1; i++){         --> n
    a1[i] = a[i];                     --> ?
    a2[i] = a[n/2 +i];                --> ?
}
for (i=0; i<=(n/2)-1; i++){          ---> n+n = n
    for (j = i+1; j<=(n/2) -1; j++){  --->  n^(2)
        if(a1[i] == a2[j])    
    a2[j] = 0;                        --> 1
        }
    }
    b1 = useless(a1);               --> ?
    b2 = useless(a2);               --> ?
    return max(b1,b2);              --> ?
}

I am very stuck on this one...

4)

void sum3(int n) {
sum = 0;                    --> 1
for(i=0; i<sqrt(n)/2; i++)  --> n
sum++;                      --> ?
for(j=0; j<sqrt(n)/4; j++)  --> n^(2)
sum++;                      --> ?
for(k=0; k<8+j; k++)        --> n^(3)                      
sum++;                      --> ?
}

Answer: O(n^(3))

5)

void sum4(int n){                
sum = 0;                    --> 1
for(i=0; i<sqrt(n)/2; i++)  --> n
for(j=1; j<8+i; j++)        --> n^(2)
for(k=j; k<8+j; k++)        --> n^(3)
sum++;                      --> ?
}

Answer: O(n^(3))

Forcatos
  • Perhaps, one of [these](http://stackoverflow.com/questions/11032015/how-to-find-time-complexity-of-an-algorithm) answers can help you – Gomiero Feb 04 '16 at 16:08

2 Answers


Let's consider a problem you already solved:

void sum2(int n){   
    sum2 = 0;
    for(i=1; i<=n; i++)
        for (j=1; j<=i; j++)
            sum2++;
}

We're counting how many increments are done on sum2: sum2++ is 1 increment. Thus we get the following sum:

sum i from 1 to n of [ sum j from 1 to i of (1) ]
= sum i from 1 to n of (i)
= n(n+1)/2
which is in O(n^2)
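
If you want to sanity-check that closed form, a small counting sketch like the one below (my own instrumentation, not code from the question; the helper name count_sum2 is mine) tallies the increments and compares them against n(n+1)/2:

#include <stdio.h>

/* Sketch (not from the original post): count the increments performed
   by sum2 and compare them with the closed form n*(n+1)/2. */
long count_sum2(int n) {
    long count = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            count++;                      /* one sum2++ */
    return count;
}

int main(void) {
    for (int n = 1; n <= 1024; n *= 2)
        printf("n=%5d  counted=%8ld  n(n+1)/2=%8ld\n",
               n, count_sum2(n), (long)n * (n + 1) / 2);
    return 0;
}

The two columns match exactly, which is what the summation above predicts.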

Let's consider 3)

float useless(a){
    n = a.length;

    // B: breaking condition
    if (n==1) return a[0];

    // L1: first loop
    // let a1, a2 be arrays of size n/2
    for (i=0; i <= (n/2)-1; i++) {
        a1[i] = a[i];
        a2[i] = a[n/2 +i];
    }

    // L2: second loop
    for (i=0; i<=(n/2)-1; i++) {
        for (j = i+1; j<=(n/2) -1; j++) {
            if(a1[i] == a2[j])
                a2[j] = 0;
        }
    }

    // R: recursion
    b1 = useless(a1);
    b2 = useless(a2);

    return max(b1, b2);
}

We're counting the number of assignments to a1 and a2: a2[j] = 0; is 1 assignment, and so is a1[i] = a[i];. (Note that we could just as easily count the number of array accesses aX[i] instead.) Let's call this number A(n), where n is the length of the input array a. We note two distinct cases for A:

A(1) = 0
A(n) = ...           n > 1

Let's see what costs we have for the general case before considering the recursion (comment R):

B executes 0 assignments
L1 executes 2*n/2 == n assignments
L2 executes at worst (if all elements are 0) (n/2)*((n/2)-1)/2 assignments

thus A executes f(n) := n + (n/2)*((n/2)-1)/2 assignments in total before reaching R; note that f(n) is in O(n^2)

Then follow two recursions of half the input size - so for A we now get:

A(1) = 0
A(n) = f(n) + 2 * A(n/2)                          n > 1
     = f(n) + 2 * ( f(n/2) + 2 * A(n/4) )
     = f(n) + 2 * ( f(n/2) + 2 * ( f(n/4) + 2 * A(n/8) ) )
     = ...
     = f(n) + sum i from 1 to x of ( 2^i * f(n/(2^i)) )
     = sum i from 0 to x of ( 2^i * f(n/(2^i)) )

The recursion happens x times, that is, until the breaking condition B is reached and n == 1. For n := 2^k this happens after exactly k halvings (e.g. for n = 8 the sizes are 8, 4, 2, 1, so k = 3); in general we use x := log2(n). From here it's algebra to get to a closed-form solution.

Note that there are some tricks to make this easier still: since we're not interested in counting the exact number of assignments, we can simplify f(n) to n^2, as both are in O(n^2). You can solve for the exact count or for the approximation A(n) = n^2 + 2 * A(n/2); both are in the same complexity class.
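
For completeness, here is one way the remaining algebra can go with that simplified recurrence A(n) = n^2 + 2 * A(n/2) (constants are approximate, only the order matters):

A(n) = sum i from 0 to x of ( 2^i * (n/(2^i))^2 )
     = n^2 * sum i from 0 to x of ( 1/(2^i) )
     <= 2 * n^2

with x = log2(n). So A(n) is in O(n^2), and since the very first call already does on the order of n^2 work before recursing, the whole function is Theta(n^2).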

BeyelerStudios

@BeyelerStudios has done a great job so far; I'll try not to repeat.

Some general principles that I think you already have (for the most part):

  • For normal, linear statements, you need a priori knowledge of the statement's complexity. Basic arithmetic operations and array accesses are O(1).
  • The order of a sequence of statements is the highest order of any statement in the sequence.
  • The order of a decision branch is the highest order of any of the branches.
  • The order of a loop is the repetition count of the loop times the order of the loop body (see the sketch after this list).
  • The order of a recursive call is the repetition of the recursion times the order of the function body.
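
As a small illustration of the sequence and loop rules above (a sketch of mine, not code from the question), sequential loops only add their costs while nested loops multiply them:

#include <stdio.h>

/* Two loops in sequence: n + n operations, still O(n). */
long sequential(int n) {
    long ops = 0;
    for (int i = 0; i < n; i++) ops++;   /* O(n) */
    for (int j = 0; j < n; j++) ops++;   /* O(n) */
    return ops;                          /* about 2n -> O(n) */
}

/* The same two loops nested: the body runs n * n times, O(n^2). */
long nested(int n) {
    long ops = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;
    return ops;                          /* n*n -> O(n^2) */
}

int main(void) {
    int n = 1000;
    printf("sequential: %ld  nested: %ld\n", sequential(n), nested(n));
    return 0;
}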

Now for problem #4:

This is not O(n^3): note that the loops are not nested. Each consists of a single increment statement, which is O(1). The first two loops run O(sqrt(n)) times; the third uses the final value of j plus a constant, so it is also O(sqrt(n)). Three O(sqrt(n)) loops in sequence are still O(sqrt(n)), so the whole fragment is O(sqrt(n)). Note especially that O(sqrt(n)) is smaller than O(n), which is what you used; the square root is n^(1/2).
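
The "final value of j" point is easy to verify with a counting sketch of my own (the helper name count_sum3 is mine; I assume, as the pseudocode does, that j stays in scope after its loop): the three sequential loops together perform roughly sqrt(n) increments.

#include <stdio.h>
#include <math.h>

/* Sketch: count the increments performed by sum3. The three loops run
   in sequence, each bounded by a constant multiple of sqrt(n). */
long count_sum3(int n) {
    long count = 0;
    int i, j, k;
    for (i = 0; i < sqrt(n) / 2; i++) count++;
    for (j = 0; j < sqrt(n) / 4; j++) count++;
    for (k = 0; k < 8 + j; k++) count++;   /* j has kept its final value */
    return count;
}

int main(void) {
    for (int n = 16; n <= 1 << 20; n *= 4)
        printf("n=%8d  counted=%6ld  sqrt(n)=%8.1f\n",
               n, count_sum3(n), sqrt((double)n));
    return 0;
}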

Problem #5:

Here the loops are nested, but look at each one carefully: the outer loop runs O(sqrt(n)) times, and the middle loop's bound 8+i also grows to about sqrt(n)/2, but the innermost loop always runs exactly 8 times (k goes from j to 8+j), which is O(1). The nesting therefore gives sqrt(n) * sqrt(n) * O(1), which is O(n), not O(n^3) and not n^(3/2).
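
Counting directly supports this (again a sketch of mine mirroring the pseudocode, with a hypothetical helper count_sum4): the number of increments grows roughly linearly in n, far below n^(3/2).

#include <stdio.h>
#include <math.h>

/* Sketch: count the increments performed by sum4. Outer loop ~sqrt(n)/2
   iterations, middle loop ~i iterations, inner loop always 8 iterations. */
long count_sum4(int n) {
    long count = 0;
    for (int i = 0; i < sqrt(n) / 2; i++)
        for (int j = 1; j < 8 + i; j++)
            for (int k = j; k < 8 + j; k++)
                count++;
    return count;
}

int main(void) {
    for (int n = 1 << 10; n <= 1 << 22; n *= 4)
        printf("n=%8d  counted=%10ld  n^1.5=%12.0f\n",
               n, count_sum4(n), pow((double)n, 1.5));
    return 0;
}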

One final detail to note for the future is that sqrt(n) is not necessarily O(1); it depends on the implementation. Modern chips have a hardware sqrt instruction that handles native integers and floats in O(1) time. However, there are arbitrary-precision float and integer types whose sqrt costs more, roughly growing with the number of digits. If your teacher hasn't made a blanket ruling on that operation, look out for sqrt and other non-trivial math functions sneaking into a loop body.

Prune