
I'm trying to grasp the idea of Big O for part of a project that is due tonight, and I don't know if I'm thinking this through right or not...

The project had us writing iterative and recursive solutions to math operations.

Here are some of the methods.

public int inc(int n)
{
    n = n + 1;
    return n;
}
public int dec(int n)
{
    n = n - 1;
    return n;
}

public int add(int lhs, int rhs)
{
    int sum = lhs;
    while (rhs > 0) {
        sum = inc(sum);
        rhs = dec(rhs);
    } // while
    return sum;
}


public int fac(int num)
{
    int ans = num;
    while (num > 1)
    {
        num = dec(num);       // dec returns the decremented value; num must be reassigned
        ans = mul(ans, num);
    }
    return ans;
}



public int div(int lhs, int rhs) throws ArithmeticException
{
    if (rhs == 0)
    {
        throw new ArithmeticException("division by zero");
    }
    int quot = 0;
    while (lhs >= rhs)
    {
        lhs = sub(lhs, rhs);   // sub is the subtraction counterpart (not shown)
        quot = inc(quot);
    }
    return quot;
}


public int lshift(int lhs, int rhs)
{
    return mul(lhs,pow(2,rhs));       
}

Would all these be O(n)? Or just inc, dec, and add?

For lines where other methods are called (like mul, the multiply method, in fac and lshift), is the call constant time, or is it O(n) again, making the whole method O(n^2)? Could someone explain?

And how would I go about dealing with recursive methods?

public int mul(int lhs, int rhs)
{
    if (rhs == 0 || lhs == 0) return 0;
    else if(rhs == 1) return lhs;
    return add(lhs, mul(lhs, dec(rhs)));
}

I'll just leave it with one recursive example unless anyone asks for others. The if and else if would each be considered constant time, correct? Then the return is calling all sorts of other recursive methods, as well as itself (obviously), and I really have no clue where to start.

Could someone try to explain this really simply?

EDIT: Added pow

public int pow(int lhs, int rhs)
{
    int ans = lhs;
    for (int i = rhs; i > 1; i--)
    {
        ans = mul(ans, lhs);
    }
    return ans;
}

2 Answers


The idea behind Big-Oh is that it describes how the running time of a particular algorithm grows with the size of its input; formally, it's an upper bound on that growth, not an average. When analyzing runtime, you're looking for key things, such as:

  • The relation of the parameter being used to the operations being done with it
  • The relation of the "critical part" of the code to the parameter being passed in

There are a few other things you'll want to look for, such as best- and worst-case behavior.
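
To make that concrete before going method-by-method, here is the add method from your question with comments of my own added (annotations only, no changes to the code), marking the "critical part" and the parameter it's bound to:

public int add(int lhs, int rhs)
{
    int sum = lhs;
    while (rhs > 0) {      // critical part: the loop runs once per unit of rhs
        sum = inc(sum);    // constant-time work each pass
        rhs = dec(rhs);    // rhs shrinks by exactly one each pass
    } // while
    return sum;            // total: rhs passes through the loop -> O(n), where n = rhs
}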

Now, on to your methods.

  • Both inc and dec are constant time. They take no longer to execute regardless of the size of the parameter passed in.

  • add is bound to the size of rhs, since you increment by one step for every value in rhs. So that runtime would be O(n).*

  • mul, per your recursive example, has three cases: two base cases and a recursive case. The base cases are presumed to run in constant time, but since there is a recursive case, it outweighs the runtime of the base cases.

    In that instance, you're bound by the size of rhs passed in, so it will run in O(n) time; see the sketch after this list.

  • fac is bound to the size of num passed in. It will run in O(n) time.

  • div is bound to lhs, and it will also run in O(n) time.

*: Talk about a strange way to add two numbers...
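
If you want to see the bound on mul concretely, here's a sketch with a hypothetical call counter (the callCount field and the mulCounted name are mine, not part of your code). Counting each invocation as one step:

static int callCount = 0;

public int mulCounted(int lhs, int rhs)
{
    callCount++;                        // one step per invocation
    if (rhs == 0 || lhs == 0) return 0;
    else if (rhs == 1) return lhs;
    return add(lhs, mulCounted(lhs, dec(rhs)));
}

Calling mulCounted(5, 1) leaves callCount at 1; calling mulCounted(5, 10) leaves it at 10. Each invocation peels exactly one off rhs, so the number of calls grows linearly with rhs.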

Makoto
  • The `mul` method in fact has a **best-case runtime** of O(1) and a **worst-case runtime** of O(n). Understanding that the runtime complexity can vary depending on the data will be important when comparing sorting algorithms (usually the next thing students do when learning about algorithm complexity). – Philipp Apr 15 '13 at 00:24
  • (Yes, a very strange way to add!!) And just to make sure I understand this: is calling other methods always considered constant time? Would my lshift method be considered constant time as well? – user2278644 Apr 15 '13 at 00:25
  • ***Not necessarily.*** It really depends on what the method is doing. Just calling a method doesn't mean it's constant time. Examine base cases, iterative cases, and critical-part operations very closely. I can't tell you specifically, since I don't know how `pow` will behave, but it's bound to `rhs`. – Makoto Apr 15 '13 at 00:26
  • Alright... so could you explain the lshift method then? Would I ever have to take into account the Big O of the methods I'm calling from within another method? (You are being extremely helpful, btw!) Edit: Added pow – user2278644 Apr 15 '13 at 00:31
  • Yes, you would have to take their behavior into account, but only when you're adding more work. From looking at `pow`, for every call to `mul`, you'll be performing `lhs` operations in `pow`. So the runtime for that is likely O(n^2). – Makoto Apr 15 '13 at 00:36
  • So since it's calling mul, and inside of that it's calling pow (which calls mul), it's like having nested for loops, and so it's O(n^2)... am I thinking of that right? And would pow then be O(n^2) as well? – user2278644 Apr 15 '13 at 00:44
  • You tell me. From what I've told you above, look for the critical part of `pow` and see which variable it's bound to. Then, observe what operations it's doing with relation to that bound. Lastly, observe the operations it's doing when it calls `mul`. – Makoto Apr 15 '13 at 00:50

inc and dec are O(1), since they can always be done in a constant number of operations, and they don't depend on the value you pass as an argument.

add, for instance, depends on the number you pass as an argument, so if we consider this number to be our "order", add is O(n).

Simply put: the higher the value you pass as an argument, the more operations you'll need to do, and this increase is linear (roughly, if you increase the input by a factor of 100, the number of operations grows by the same order of magnitude).

Your mul implementation will also run in O(n) time, since this is the cost of the most "expensive" operation you have, add.

What you need to understand is the meaning of Big-O notation: the relationship between the running time of your algorithm and the variation on the size of the inputs.

If you keep this in mind, it'll be much easier for you to figure out the cost of your methods.
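
For example, here's a hypothetical instrumented copy of add (the iteration counter and the addCounted name are mine, purely for illustration):

public int addCounted(int lhs, int rhs)
{
    int iterations = 0;
    int sum = lhs;
    while (rhs > 0) {
        sum = inc(sum);
        rhs = dec(rhs);
        iterations++;      // one unit of work per pass
    }
    System.out.println(iterations + " iterations");
    return sum;
}

addCounted(0, 100) prints 100 iterations; addCounted(0, 200) prints 200. Double the input, double the work: that linear relationship is exactly what O(n) captures.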

pcalcao