
Is the execution time of this unique string function reduced from the naive O(n^2) approach?

This question has a lot of interesting discussion, which leads me to wonder: if we put some threshold on the algorithm, would it change the Big-O running time complexity? For example:

void someAlgorithm(int n) {
    if (n < SOME_THRESHOLD) {
         // do O(n^2) algorithm
    }
}

Would it be O(n^2) or would it be O(1)?

invisal
    Is there an `else` condition to your code snippet? If you don't execute anything above a hard-coded finite value, then the running time will be bounded by a constant. – Tim Biegeleisen May 21 '15 at 03:28

2 Answers


This would be O(1), because there is a constant such that no matter how big the input is, your algorithm will finish in time smaller than that constant.

Technically, it is also O(n^2), because there is a constant c such that no matter how big your input is, your algorithm will finish within c * n^2 time units. Since big-O only gives an upper bound, everything that is O(1) is also O(n^2).

Ishamael

If SOME_THRESHOLD is constant, then you've hard-coded a constant upper bound on the growth of the function (and f(x) = O(g(x)) gives an upper bound of g(x) on the growth of f(x)).

By convention, O(k) for some constant k is just O(1) because we don't care about constant factors.

Note that the lower bound is unknown, at least theoretically, because we don't know anything about the lower bound of the O(n^2) part. We do know that if f(x) = Omega(h(x)), then h(x) = O(1), because f(x) = O(1). Sub-constant-time functions are possible in theory, although in practice h(x) = 1, so f(x) = Omega(1).

What all this means is by forcing a constant upper bound on the function, the function now has a tight bound: f(x) = Theta(1).
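The argument above can be written out explicitly; a minimal sketch, writing T for the constant SOME_THRESHOLD and f(n) for the running time:

```latex
% Upper bound: at most T^2 steps are ever performed, for every n.
f(n) \le c \cdot T^2 \quad \text{for all } n \;\implies\; f(n) = O(1)
% Lower bound (in practice): every call does at least one unit of work.
f(n) \ge 1 \quad \text{for all } n \;\implies\; f(n) = \Omega(1)
% Matching upper and lower bounds give the tight bound:
f(n) = \Theta(1)
```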

Anthony