
I was watching an online course on Big O and algorithm complexity and had a doubt about one example:

private static int func(long n) {

    int result = 0;

    for(int i = 0; i <= n; i++) {
        result = i*i;
    }

    return result;
}

The lecturer derived that its complexity is O(n). I thought that since the loop is O(n) and multiplication is O(n^2), and i depends on n, it should have been O(n^2).

I then just wrote a sample Java app to check the actual execution time:

    private static long N1 = 10;
    private static long N2 = 100000;

    public static void main(String[] args) {

        long startTime1 = System.nanoTime();
        System.out.println(func(N1));
        long stopTime1 = System.nanoTime();
        long difference1 = stopTime1 - startTime1;

        long startTime2 = System.nanoTime();
        System.out.println(func(N2));
        long stopTime2 = System.nanoTime();
        long difference2 = stopTime2 - startTime2;

        System.out.println("N2 / N1 = " + N2 / N1);
        System.out.println("difference2 / difference1 = " + difference2 / difference1);
    }

Result was:

100
1410065408
N2 / N1 = 10000
difference2 / difference1 = 5

So if N2 is 10^4 times bigger, the execution time is just 5 times bigger, i.e. it looks like logarithmic complexity. Why is that? Is it possible to derive this a priori, without actual testing?

UPDATE Thanks to everyone, I see that I missed or misunderstood a couple of things. I would accept each answer if I could. Thank you guys!

Roman
  • Your benchmark is wrong because it does not include the HotSpot warmup. See http://stackoverflow.com/questions/504103/how-do-i-write-a-correct-micro-benchmark-in-java – lbalazscs Apr 04 '15 at 21:21
  • The complexity of multiplication is theoretically dependent on the number of *bits* in the operands, which is a constant (32) in your program (though `i` should have been `long`, which would give 64; still a constant). So the only operation that varies with the input is the loop. – RealSkeptic Apr 04 '15 at 21:29

6 Answers


You determine the time complexity by finding the most frequently executed instruction (the dominant instruction), or one of them.

In your case this is `i <= n`, or those performed just one time less: `result = i*i;` or `i++`. While assessing complexity, a difference of +1 or -1 is not important, so pick whichever you want.

Then you express the number of executions of the dominant instruction as a function of n. In your example, `i <= n` is evaluated exactly n + 2 times, thus the complexity is O(n + 2) = O(n), because constants do not matter when counting complexity.
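As a concrete illustration (the class and method names below are mine, not from the original code), you can instrument the loop to count exactly how often the dominant instruction, the condition check, is evaluated:

```java
public class DominantOp {

    // Counts how many times the loop condition `i <= n` is evaluated
    // in the loop from the question.
    static long countChecks(long n) {
        long checks = 0;
        int result = 0;
        for (int i = 0; i <= n; i++) {
            checks++;          // one check that evaluated to true
            result = i * i;    // the loop body, run once per true check
        }
        return checks + 1;     // the final check that evaluated to false
    }

    public static void main(String[] args) {
        System.out.println(countChecks(10));     // 12 = n + 2
        System.out.println(countChecks(100000)); // 100002
    }
}
```

The count grows as n + 2, i.e. linearly in n, which is exactly what O(n) asserts.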

Final note: multiplying two ints is not O(n^2); you may have confused it with multiplying BigIntegers rather than plain ints.

Yoda

Try to rerun your measurements first. To get rid of some inaccuracy, don't include the time it takes to print the value:

long startTime1 = System.nanoTime();
int res = func(N1);
res = func(N1);
...
res = func(N1);
long stopTime1 = System.nanoTime();
System.out.println(res);

Secondly, you should run the calculation several times (say, eight) while measuring, so it runs longer and the accuracy improves further. I prefer to run a calculation for 30 seconds or more, and to make sure the input n is large enough. Even then, run the whole test at least three times and average the results.
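A sketch of such a measurement (the warmup and repetition counts are arbitrary choices, and `Bench`/`measure` are names I made up, not part of the original code):

```java
public class Bench {

    // The function from the question.
    static int func(long n) {
        int result = 0;
        for (int i = 0; i <= n; i++) {
            result = i * i;
        }
        return result;
    }

    // Prevents the JIT from treating the func calls as dead code.
    static volatile int sink;

    // Returns the average nanoseconds per call of func(n) over `reps` runs,
    // after a warmup phase so the JIT has compiled func before we measure.
    static long measure(long n, int reps) {
        int s = 0;
        for (int i = 0; i < 10_000; i++) {   // warmup
            s += func(n);
        }
        long start = System.nanoTime();
        for (int i = 0; i < reps; i++) {
            s += func(n);
        }
        long elapsed = System.nanoTime() - start;
        sink = s;
        return elapsed / reps;
    }

    public static void main(String[] args) {
        System.out.println("n=1000:   " + measure(1_000, 1_000) + " ns/call");
        System.out.println("n=100000: " + measure(100_000, 1_000) + " ns/call");
    }
}
```

With warmup and averaging, the two timings should scale roughly with n, unlike the single-shot measurement in the question.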

EDIT: RealSkeptic is right about the HotSpot warmup; the link he gave in the comments is a good explanation.

Next, multiplication runs in a few clock cycles on modern CPUs; from this point of view, it is just a single operation. Big O notation keeps you focused on the overall growth rate of your algorithm, so you can choose the algorithm with the lower complexity first. After that you can do code-level and perhaps hardware-specific optimization.

nio

The complexity is O(n) because your function performs on the order of n operations; it doesn't matter what kind of operations they are.

In this case, as you pass larger values of n to func, the number of operations grows linearly, and that is what O(n) complexity means. It would have been O(n^2) if you had nested loops each going from 1 to n.
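To see the contrast, here is a sketch (the names are illustrative, not from the question) counting the operations of a single loop versus nested loops:

```java
public class Nested {

    // O(n): one loop, the body runs n + 1 times.
    static long linearOps(long n) {
        long ops = 0;
        for (long i = 0; i <= n; i++) {
            ops++;
        }
        return ops;
    }

    // O(n^2): nested loops, the inner body runs (n + 1)^2 times.
    static long quadraticOps(long n) {
        long ops = 0;
        for (long i = 0; i <= n; i++) {
            for (long j = 0; j <= n; j++) {
                ops++;
            }
        }
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(linearOps(100));    // 101
        System.out.println(quadraticOps(100)); // 10201
    }
}
```

Multiplying n by 100 multiplies the first count by about 100 and the second by about 10000; that difference in growth is what the notation captures.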

Benchmarks don't settle this, as they are implementation-specific and depend on other factors. To get a more accurate result, save the values of func(N1) and func(N2) into variables and print them at the very end.

m0s

Big O describes asymptotic growth, not actual time. We are not counting operations; we are showing the relationship between the input size and the overall running time.

While it's right to measure, you haven't made enough measurements to reach a useful conclusion. Try thousands of different values of n, not just two, and graph the running times you get. The graph will contain outliers, but ultimately you should see a straight-line (linear) relationship between n and execution time, hence O(n). If the graph curved upward, it could indicate O(n log n), O(n^2), O(n!), etc.; if it flattened out, O(log n).
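Such an experiment could be sketched like this (the range of n values is an arbitrary choice, and the warmup loop is there so the JIT doesn't skew the first samples):

```java
public class Sweep {

    // The function from the question.
    static int func(long n) {
        int result = 0;
        for (int i = 0; i <= n; i++) {
            result = i * i;
        }
        return result;
    }

    public static void main(String[] args) {
        // Warm up so the JIT has compiled func before we measure.
        for (int i = 0; i < 10_000; i++) func(1_000);

        // Print tab-separated (n, nanoseconds) pairs suitable for plotting.
        for (long n = 100_000; n <= 2_000_000; n += 100_000) {
            long start = System.nanoTime();
            func(n);
            long ns = System.nanoTime() - start;
            System.out.println(n + "\t" + ns);
        }
    }
}
```

Plot the output with any tool you like; after the first few samples, the points should fall close to a straight line through the origin.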

Stuart Caie

No, because of how this line is treated:

result = i*i;

Even though multiplying large numbers could in principle take a large number of bit operations, this is not counted, because it is considered a single machine instruction taking a constant amount of time. Remember that time complexity doesn't tell you exactly how long your code takes to execute. What we look at is the function describing how many times this instruction executes, and its shape on a graph. It also doesn't matter whether that graph is a line through the origin or through y = 2; what matters is that it is linear. That is why the time complexity remains O(n).


Just a side note:
Even if multiplication were relevant here, the important point would be how the multiplication is performed. For primitive types it is implemented in hardware, so we can consider it constant, O(1). But even ignoring that fact, multiplication need not use the paper-and-pencil method you learned in school; the russian-peasant algorithm (a.k.a. ancient-Egyptian multiplication), for example, would run in O(log max(n, m)) for n * m.