Let's consider the following function:
int foo(int n) {
    int x = 0;
    // first (nested) loop: the body runs n * n times
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            x -= 1;
        }
    }
    x = 0;
    // second loop: the body runs n times
    for (int i = 0; i < n; i++) {
        x += 1;
    }
    return x;
}
According to Big O notation, the run-time complexity of this function (being very precise) will be:
O(1 + N^2 + 1 + N) = O(N^2)
The dependency between N and the upper bound of the algorithm's run time is:
1 + N^2 + 1 + N
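To sanity-check that formula, here is a small instrumented copy of foo (my own sketch, assuming the counting model where every assignment to x is one basic operation):

#include <cstdio>

// Instrumented copy of foo() (not part of the original code): it counts one
// "basic operation" per assignment to x, which is the counting model behind
// the 1 + N^2 + 1 + N formula above.
long long count_ops(int n) {
    long long ops = 0;
    int x = 0;
    ops += 1;                   // the initial "x = 0"         -> 1
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            x -= 1;
            ops += 1;           // inner body runs n * n times -> N^2
        }
    }
    x = 0;
    ops += 1;                   // the reset "x = 0"           -> 1
    for (int i = 0; i < n; i++) {
        x += 1;
        ops += 1;               // second loop runs n times    -> N
    }
    return ops;                 // expected: 1 + n*n + 1 + n
}

int main() {
    int sizes[] = {1, 10, 100};
    for (int n : sizes) {
        std::printf("n=%d counted=%lld formula=%d\n",
                    n, count_ops(n), 1 + n * n + 1 + n);
    }
    return 0;
}

The counted value matches the formula for every n, so the "precise" expression itself is not in question; the question is what happens to it next.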
According to this article, Asymptotic Analysis lets us drop the constants and the non-dominant terms. So the resulting dependency will be:
1 + N^2 + 1 + N = N^2
which is the same expression as in the Big O notation.
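To see numerically why the dropped terms stop mattering, here is another small sketch of mine that compares the full count 1 + N^2 + 1 + N with the dominant term N^2; the ratio approaches 1 as n grows:

#include <cstdio>

int main() {
    // Compare the full operation count with the dominant term n^2.
    // The ratio tends to 1, which is why dropping the lower-order terms
    // does not change the growth rate.
    long long sizes[] = {10, 100, 1000, 1000000};
    for (long long n : sizes) {
        long long full = 1 + n * n + 1 + n;
        long long dominant = n * n;
        std::printf("n=%-8lld full=%-14lld dominant=%-14lld ratio=%.6f\n",
                    n, full, dominant, (double)full / dominant);
    }
    return 0;
}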
But according to this lecture, Asymptotic Analysis does not allow us to drop either the constants or the non-dominant terms, so if I evaluate this expression using Asymptotic Analysis I get:
1 + N^2 + 1 + N
I'm very confused, because until this moment I was totally sure that Asymptotic Analysis and Big O notation were the same thing. So I have two questions:
- What is the difference between Big O notation and Asymptotic Analysis?
- Which of the two articles is lying?