I was watching an online course on Big O and algorithm complexity and had a doubt about one example:
private static int func(long n) {
    int result = 0;
    for (int i = 0; i <= n; i++) {
        result = i * i; // overwrites result each pass, so only the last square survives
    }
    return result;
}
The lecturer derived that its complexity was O(n). I thought that, since the loop is O(n) and multiplication is O(n^2), and i depends on n, it should have been O(n^2).
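To make my reasoning concrete, here is an instrumented variant of the same loop (a sketch of my own, not from the course) that counts how many multiplications actually execute for a given n:

private static long countMultiplications(long n) {
    int result = 0;
    long multiplications = 0;
    for (int i = 0; i <= n; i++) {
        result = i * i;     // one fixed-width multiplication per iteration
        multiplications++;
    }
    return multiplications; // n + 1 for n >= 0, i.e. linear in n
}

For n = 10 and n = 100000 it returns 11 and 100001, so the number of multiplications performed grows linearly with n.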
I then just wrote a sample Java app to check the actual execution time:
private static long N1 = 10;
private static long N2 = 100000;

public static void main(String[] args) {
    long startTime1 = System.nanoTime();
    System.out.println(func(N1));
    long stopTime1 = System.nanoTime();
    long difference1 = stopTime1 - startTime1;

    long startTime2 = System.nanoTime();
    System.out.println(func(N2));
    long stopTime2 = System.nanoTime();
    long difference2 = stopTime2 - startTime2;

    System.out.println("N2 / N1 = " + N2 / N1);
    System.out.println("difference2 / difference1 = " + difference2 / difference1);
}
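For what it's worth, the same measurement pattern can be factored into a helper (my own refactoring sketch; the behavior is unchanged), so each call is timed the same way:

// Sketch (my own refactoring): run a task once, print its result like the
// original code does, and return the elapsed wall-clock nanoseconds.
private static long timeNanos(java.util.function.LongSupplier task) {
    long start = System.nanoTime();
    System.out.println(task.getAsLong());
    return System.nanoTime() - start;
}

With it, the two measurements become long difference1 = timeNanos(() -> func(N1)); and long difference2 = timeNanos(() -> func(N2));.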
The result was:
100
1410065408
N2 / N1 = 10000
difference2 / difference1 = 5
So if N2 is 10^4 times bigger, the execution time is only 5 times bigger, i.e. it looks like logarithmic complexity. Why is that? Is it possible to derive this a priori, without actual testing?
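(As an aside on the second printed value: 1410065408 is just 100000 * 100000 wrapped into 32-bit int arithmetic, since result is an int. A one-liner to check, using nothing beyond standard Java long-to-int narrowing:

System.out.println((int) 10_000_000_000L); // prints 1410065408

so the two printed numbers are the squares of N1 and N2, the second one overflowed.)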
UPDATE: Thanks to everyone, I see that I missed or misunderstood a couple of things. I would accept each answer if I could. Thank you, guys!