I understand that classifying something in Big-O essentially means giving an "upper bound" of sorts, so I understand it graphically. What I don't understand is how to use the formal definition of Big-O to solve these types of problems. Any help is appreciated.
-
Maybe this is already answered in https://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it?rq=1, https://stackoverflow.com/questions/487258/what-is-a-plain-english-explanation-of-big-o-notation?rq=1, or https://stackoverflow.com/questions/2754718/big-oh-notation-formal-definition?rq=1? – Suma Mar 11 '21 at 07:26
-
For this specific problem, I don't think so. – Aeden Schmidt Mar 11 '21 at 07:30
1 Answer
Although there are platforms better suited for this question than SO, since it is purely mathematical, understanding it is fundamental to using big-O in a Computer Science context as well, so I like the question. And understanding your particular example will likely shed some light on what big-O is in general, as well as provide some practical intuition. So I will try to explain:
We have a function f(n) = n². To show that it is not in O(1), we have to show that it grows faster than a function g(n) = c, where c is some constant. In other words, that f(n) > g(n) for sufficiently large n.
What does sufficiently large mean? It means that for any constant c, we can find an N so that f(n) > g(n) for all n > N.
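To write the same thing symbolically (using nothing beyond the standard textbook definition of big-O, which the question does not restate, so take this as the usual formulation): f being in O(1) would require some constant c and some threshold N with f(n) ≤ c for all n > N. So it is enough to prove the stronger statement

$$\forall c > 0 \;\; \exists N \;\; \forall n > N : \quad f(n) > c,$$

that is, no constant ever remains an upper bound on f(n).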
This is how we define asymptotic behaviour. Your constant function may be larger than f(n) at first, but as n grows, you eventually reach a point past which f(n) stays larger forever. How do we prove it?
Whatever constant c you choose - however large - we can construct our N. Let us say N = √c. Since c is a constant, so is √c.
Now, we have f(n) = n² > c = g(n) whenever n > √c.
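For concreteness (the numbers here are arbitrary, chosen only to illustrate the construction): claim the bound c = 1,000,000. Then N = √c = 1,000, and any n > 1,000, say n = 1,001, gives f(n) = 1,001² = 1,002,001 > 1,000,000 = c.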
Therefore, f(n) is not in O(1). □
The key here is that we found a constant N (that is, one which depends on our constant c but not on our variable n) such that this inequality holds. It can take some effort to wrap one's head around, but working through a few other simple examples helps build the intuition.
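If it helps to see the construction numerically, here is a small Python sketch (my own illustration, not part of any library or standard; the helper name `crossover_point` is made up). It picks a few arbitrary constants c, computes N = √c as in the proof, and checks that the very next integer past N already breaks the would-be bound:

```python
import math

def crossover_point(c):
    """Mirror the proof's construction: return N = sqrt(c),
    so that n * n > c for every n > N."""
    return math.sqrt(c)

# A few arbitrary candidate constants c, as large as you like.
for c in [10, 1_000, 1_000_000, 10**12]:
    N = crossover_point(c)
    n = math.floor(N) + 1  # the smallest integer strictly greater than N
    print(f"c = {c}: N = {N}, n = {n}, n^2 = {n * n} > c? {n * n > c}")
```

Each comparison comes out True because N was constructed from c, and the proof guarantees this not just for the first integer past N but for every n > N.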

-
May I ask why g(n) is equal to c, where c is some constant? I'm a little confused, as the formal definition states that "f(n) is always greater than or equal to zero and is always less than or equal to cg(n)." – Aeden Schmidt Mar 12 '21 at 03:12
-
You are comparing with the class of constant functions *O(1)*. The class of constant functions can be defined as any function *g(n) = c*, where *c* is a constant. The confusion here is that you can choose *c* as large as you want, but the point is that it doesn't depend on *n*. This means that however large *c* is, a growing function like *f(n) = n²* will always reach larger values eventually, as *n* grows. – Berthur Mar 12 '21 at 12:12