
I have searched all over Stack Overflow and couldn't find a concise answer to this. Consider this code:

n = [ [1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12] ] # O(1)
for i in n: # O(n)
    for j in i: # O(n)
        print(j)

If my understanding of O() is correct, the time complexity for this algorithm is 1+n*n, and if we throw away the insignificant 1+, it becomes n^2, or quadratic time. Now consider this code:

n = input() # O(1)
for i in range(0, len(n), 2): # O(n/2)?
    print(n[i])

This time we use division instead of an exponent: 1+n/2 -> n/2.

But everywhere I searched, people said O(n/2) == O(n) because O() measures relative time, etc. If O(n/2) is absolute and we should collapse it into O(n) when using big O, doesn't that make complexity classes like log n and n! incorrect too, because they would just be modifying n with absolute values...?

n^2 == n because it's the same as n/2, only with ^ instead of /. n! isn't correct either, because factorial is just a series of multiplications, so n! is the same as n*(n-1)*... which, I believe, would collapse into something completely different after throwing away all the -1s. Where am I wrong?

P.S. My understanding of logarithms and big O notation is very incomplete; please don't hate me if I'm missing something obvious.

Nickaos
  • Because `O(n/2)` == `O(n/1)`, but `O(n^2)` != `O(n^1)`. –  Feb 15 '22 at 14:27
  • The question is really "why do people say O(n) but not O(n/2)?". The answer is that people typically use the simplest representative of a big-O class. – Paul Hankin Feb 15 '22 at 14:32
  • `O(f(n)) = O(c * f(n))` if `c` is a constant. And `1/2` is a constant, `n` isn't. – user2390182 Feb 15 '22 at 14:33
  • An algorithm of `O(n)` time complexity is such that if n is doubled, then the algorithm takes twice as long. An algorithm of `O(n^2)` time complexity is such that if n is doubled, then the algorithm takes *four* times as long. So "`n^2` == `n`" is very wrong. – jjramsey Feb 15 '22 at 14:34
  • Think of it as being proportional to, and so you can ignore the constant. O(4n) is proportional to n, and so is O(n/2), where the constant is 1/2. – jarmod Feb 15 '22 at 14:37
  • *Additive* and *multiplicative* constants don't matter. You can simplify anything of the form `O(mn+b)` to `O(n)` when `m` and `b` are constants. Other constants do matter, e.g. exponents. – John Kugelman Feb 15 '22 at 15:10
  • The growth of O(n) and O(n/2) are both straight lines. The growth of O(n^2) is a curve. – stark Feb 15 '22 at 16:36

1 Answer


Have a look at the Wikipedia page on big O notation:

In typical usage the O notation is asymptotical, that is, it refers to very large x. In this setting, the contribution of the terms that grow "most quickly" will eventually make the other ones irrelevant. As a result, the following simplification rules can be applied:

  • If f(x) is a sum of several terms, if there is one with largest growth rate, it can be kept, and all others omitted.
  • If f(x) is a product of several factors, any constants (terms in the product that do not depend on x) can be omitted.
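As a rough sketch of why this works, you can count how many times the innermost statement runs instead of timing anything; the helper functions below are just for illustration. Doubling n roughly doubles the count for a plain loop and for a step-2 loop like the one in the question, but quadruples it for a nested loop:

def count_linear(n):    # one pass over n items: ~n steps
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_half(n):      # every second item, like the question's loop: ~n/2 steps
    ops = 0
    for _ in range(0, n, 2):
        ops += 1
    return ops

def count_nested(n):    # nested loops over n items: ~n*n steps
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

for n in (100, 200, 400):
    print(n, count_linear(n), count_half(n), count_nested(n))

# 100 100 50 10000
# 200 200 100 40000    <- doubling n doubles the linear and "half" counts...
# 400 400 200 160000   <- ...but quadruples the nested count

The 1/2 only rescales one column by a fixed factor, so it disappears under the second rule above; the extra factor of n in the nested version depends on n and therefore cannot be dropped.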
Jacques Gaudin