
Let's consider the following function:

```c
int foo(int n) {
  int x = 0;
  for(int i = 0; i < n; i++) {
    for(int j = 0; j < n; j++) {
      x -= 1;
    }
  }

  x = 0;

  for(int i = 0; i < n; i++) {
    x += 1;
  }
  return x;
}
```
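
To make the counting below concrete, here is a small instrumented copy of the function (added for illustration only; `count_updates` and the `main` harness are not part of the original code) that tallies every update of `x`:

```c
#include <stdio.h>

/* Instrumented copy of foo: counts every assignment to x so the count can be
   compared with the hand-derived formula n^2 + n + 2. */
long long count_updates(int n) {
  long long ops = 0;
  int x = 0; ops++;               /* initial x = 0   ->   1 operation   */
  for(int i = 0; i < n; i++) {
    for(int j = 0; j < n; j++) {
      x -= 1; ops++;              /* nested loops    -> n*n operations  */
    }
  }
  x = 0; ops++;                   /* reset           ->   1 operation   */
  for(int i = 0; i < n; i++) {
    x += 1; ops++;                /* final loop      ->   n operations  */
  }
  (void)x;                        /* the value of x is not needed here  */
  return ops;
}

int main(void) {
  for(int n = 1; n <= 5; n++) {
    printf("n=%d  counted=%lld  n^2+n+2=%d\n", n, count_updates(n), n*n + n + 2);
  }
  return 0;
}
```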

According to Big O notation, the run-time complexity of this function is (I will be very precise):

O(1 + N^2 + 1 + N) = O(N^2)

So the dependency between N and the upper bound on the algorithm's run-time is:

1 + N^2 + 1 + N

According to this article, Asymptotic Analysis lets us drop the constants and non-dominant terms, so the resulting dependency is:

1 + N^2 + 1 + N = N^2

Which is the same expression as in the Big O notation above.

But according to this lecture, Asymptotic Analysis doesn't allow us to drop constants or non-dominant terms, so if I evaluate this expression using Asymptotic Analysis I get:

1 + N^2 + 1 + N 

I'm very confused because until this moment I was totally sure that Asymptotic Analysis and Big O are the same thing. So I have two questions:

  1. What is the difference between Big O notation and Asymptotic Analysis?
  2. Which of the two articles is lying?
No Name QA
  • In the lecture, where does it say "Asymptotic Analysis doesn't allow us to drop constants or non-dominant terms"? It even has a clear example of a cubic algorithm with square and constant terms in it, and the notation is O(n^3). – Ashraff Ali Wahab Sep 20 '18 at 20:08
  • Asymptotic analysis is analyzing some limiting behavior of a function (say, as x approaches infinity, zero, or another value), whereas Big O is one special case: finding an upper bound of a function. – The Scientific Method Sep 20 '18 at 20:10
  • @AshraffAliWahab Quote from the article: `For example, we say the standard insertion sort takes time T(n) where T(n) = c*n^2 + k for some constants c and k. In contrast, merge sort takes time T'(n) = c'*n*log2(n) + k'.` The author doesn't drop the constants. – No Name QA Sep 20 '18 at 20:11
  • @TheScientificMethod Thank you! But why do I find, in many articles, something like this quote: `Big O describes Asymptotic growth rate`? Does it mean that Big O is a simplified version of Asymptotic Analysis? – No Name QA Sep 20 '18 at 20:14
  • I think you missed the next line: "The asymptotic behavior of a function f(n) (such as f(n)=c*n or f(n)=c*n^2, etc.) refers to the growth of f(n) as n gets large. We typically ignore small values of n, since we are usually interested in estimating how slow the program will be on large inputs." So the author never said we shouldn't ignore them. – Ashraff Ali Wahab Sep 20 '18 at 20:15
  • @NoNameQA That is the *time* it actually takes, hence the function is named `T`. When you then *apply* [asymptotic analysis](https://en.wikipedia.org/wiki/Asymptotic_analysis) to `T`, you get `T(n) ~ n^2`, aka _O(n^2)_. Asymptotic analysis applies to a function, so you need to determine that function first, before you can analyze it. Said another way, asymptotic analysis is the process of determining Big-O. – Andreas Sep 20 '18 at 20:15
  • @Andreas `Said another way, asymptotic analysis is the process of determining Big-O` I don't get it. Could you explain it in more detail? – No Name QA Sep 20 '18 at 20:20
  • @NoNameQA Yes, one type of asymptotic analysis. – The Scientific Method Sep 20 '18 at 20:24
  • @TheScientificMethod One last question: Does it also mean that `worst-case run-time` and `upper bound run-time` are synonyms? – No Name QA Sep 20 '18 at 20:39
  • Yes, you are right. – The Scientific Method Sep 20 '18 at 20:41
  • @NoNameQA this may help also https://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it – The Scientific Method Sep 20 '18 at 20:44
  • @TheScientificMethod Sorry for being annoying, but let me ask one more question. According to the definition, `algorithm run-time complexity is the amount of time required by the algorithm to complete`. If that's true, it means that `worst-case run-time`, `upper bound runtime`, `run-time complexity` and `order of growth` are the same thing. And if that is true too, why do we need so many names for the same thing? – No Name QA Sep 20 '18 at 20:53

1 Answer


What is the difference between Big O notation and Asymptotic Analysis?

Asymptotic analysis is a method for analysing whether one algorithm is "faster" than another, in the sense that its runtime scales better with the size of the input. It only says something about large inputs, but in return it abstracts away external factors that may affect the speed of an algorithm, such as hardware and programming language.

For instance, I can sort a list by hand quicker than a supercomputer can, if I use a "faster" algorithm and the list is long enough.
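
As a rough illustration (the speeds below are made-up numbers, not a benchmark): suppose a slow machine runs an O(n log n) merge sort at 10^3 operations per second while a fast machine runs an O(n^2) insertion sort at 10^6 operations per second. Beyond some input size, the slow machine with the better algorithm wins:

```c
#include <math.h>
#include <stdio.h>

/* Made-up speeds for illustration: a slow machine doing O(n log n) work at
   1e3 ops/s vs a fast machine doing O(n^2) work at 1e6 ops/s.
   Past some n, the slow machine with the better algorithm finishes first. */
int main(void) {
  for (double n = 1e3; n <= 1e7; n *= 10) {
    double slow_seconds = n * log2(n) / 1e3;  /* n log n work at 10^3 ops/s */
    double fast_seconds = n * n       / 1e6;  /* n^2     work at 10^6 ops/s */
    printf("n=%10.0f  slow machine (merge sort): %12.1f s   "
           "fast machine (insertion sort): %12.1f s\n",
           n, slow_seconds, fast_seconds);
  }
  return 0;
}
```

With these arbitrary speeds the fast machine wins up to roughly n = 10^4 and loses for everything larger, which is exactly the kind of statement asymptotic analysis is designed to capture.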

But how can we compare the asymptotic performance of two algorithms? There are three common measures: Big O, Big Omega and Big Theta. For a detailed explanation, please refer to this answer.
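
For reference, here are the usual definitions (my own summary, not taken from the linked answer), applied to the question's operation count T(n) = n^2 + n + 2:

```latex
\begin{aligned}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n)) \\[1ex]
T(n) = n^2 + n + 2  &\le 3n^2 \ \text{for } n \ge 2 \ \Rightarrow\ T(n) = O(n^2) \\
T(n) = n^2 + n + 2  &\ge n^2 \ \text{for } n \ge 1 \ \Rightarrow\ T(n) = \Omega(n^2), \ \text{hence } T(n) = \Theta(n^2)
\end{aligned}
```

Dropping the constants and the non-dominant terms is therefore not an extra rule on top of the definition: any such terms can be absorbed into the constant c once n is large enough.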

Which of the two articles is lying?

I think you have got the answer in the comments.

Lerner Zhang