
We know that some algorithms' asymptotic time complexities are functions of n, such as

O(log* n), O(log n), O(log log n), O(n^c) with 0 < c < 1, ...

May I know what the smallest asymptotic time complexity of an algorithm, as a function of n, is?

  • Update 1: we are looking for an asymptotic time complexity that is a function of n. O(1) is the smallest, but it does not involve n.
  • Update 2: O(1) is the smallest time complexity possible, but what is the next-smallest well-known complexity that depends on n? So far my research has found:

    O(alpha(n)) (inverse Ackermann): amortized time per operation on a disjoint set

    or O(log* n) (iterated logarithm): the find algorithm of Hopcroft and Ullman on a disjoint set
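To make these concrete, here is a minimal sketch (my own illustration, not from any particular library) of the iterated logarithm log* n: the number of times log2 must be applied before the value drops to 1 or below.

```python
import math

def log_star(n):
    """Iterated logarithm: how many times log2 must be applied
    to n before the result is <= 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# Grows absurdly slowly: log*(65536) = 4 and log*(10^100) = 5.
```

Even for astronomically large inputs, log* n stays in single digits, which is why it is often treated as "effectively constant" in practice.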

super1ha1
  • `smallest algorithm's`, what exactly does this mean? – vish4071 Oct 07 '15 at 10:50
  • O(1) -> constant time – Hungry Oct 07 '15 at 10:51
  • 2
    `O(1)` is the smallest possible category, for example an algorithm determining if `n` is odd or even. There can't be anything smaller than `O(1)` because it encompasses the case when you do nothing for every input. – biziclop Oct 07 '15 at 10:51
  • A lot of algorithms have O(1) complexity, like stack push-pop, returning 1st element from a list, queue push-pop etc. An infinitely ending list. – vish4071 Oct 07 '15 at 10:54
  • 2
    That's similar to asking "what is the smallest real number larger than 1". Can't be answered because there is always yet another number in between. – Henry Oct 07 '15 at 10:56
  • 1
    The smallest non-O(1) complexity that I know of in a practical algorithm is the [Disjoint-set data structure](https://en.wikipedia.org/wiki/Disjoint-set_data_structure), where the amortized time complexity is the inverse ackerman function. This function gives less than 5 even when `n` is in the trillions of trillions. – interjay Oct 07 '15 at 11:01
  • 1
    define "well-known function". – Henry Oct 07 '15 at 11:07
  • 1
    For what it's worth, the Ackermann function grows even more slowly than the iterated logarithm. – Sven Marnach Oct 07 '15 at 11:15
  • Actually O(1/n) = o(1) – Niklas B. Oct 07 '15 at 18:08

3 Answers


Apart from the trivial O(1), the answer is: there isn't one.

If something isn't O(1) (that is, as n -> infinity, computing time goes to infinity), then whatever bounding function of n you find, there's always a smaller one: just take the logarithm of the bounding function. You can do this indefinitely, hence there's no smallest non-constant bounding function.

However in practice you should probably stop worrying when you reach the inverse Ackermann function :)

biziclop
  1. It is not necessary that the complexity of a given algorithm be expressible via well-known functions. Also note that big-O is not the complexity of a given algorithm; it is an upper bound on the complexity.
  2. You can construct functions growing as slowly as you want, for instance n^(1/k) for any k.
  3. O(1) is as low as you can go in terms of complexity, and strictly speaking 1 is a valid function; it simply is constant.

EDIT: a really slowly growing function that I can think of is the iterated logarithm, which bounds the amortized complexity of a disjoint-set forest implemented with both path compression and union by rank.
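For reference, the structure mentioned above can be sketched like this (an illustrative implementation; the class and method names are my own):

```python
class DisjointSet:
    """Disjoint-set forest with union by rank and path compression.
    Amortized cost per operation is O(alpha(n)), the inverse Ackermann
    function, which is effectively constant for any practical n."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Path compression (halving variant): shortcut every other
        # node on the path toward the root.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False  # already in the same set
        # Union by rank: attach the shallower tree under the deeper one.
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True
```

Using only one of the two heuristics already gives the iterated-logarithm bound of Hopcroft and Ullman; combining both tightens it to the inverse Ackermann bound.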

Ivaylo Strandjev
  • yes, O(1) is the smallest we can go, but what is the next-smallest well-known function of n? So far as I have researched, it is O(alpha(n)) or O(log* n), but I am not sure whether that is correct – super1ha1 Oct 07 '15 at 11:04
  • @super1ha1 as I pointed out, there is no such function. The lowest I can think of is the iterated logarithm of the DSF, which is practically constant – Ivaylo Strandjev Oct 07 '15 at 12:10

There is always a "smaller" complexity than whatever is suggested.

O(log log log log(n)) < O(log log log(n)) < O(log log (n)) < O(log(n)). 

You can stack as many logs as you want, but I don't know whether there are real-life examples of these.

So my answer is: you can get ever closer to O(1), but there is no smallest one.
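To see how quickly the stacked logarithms collapse, here is a small sketch (the function name is my own):

```python
import math

def nested_log(n, depth):
    """Apply log2 to n `depth` times (assumes the result stays > 0)."""
    for _ in range(depth):
        n = math.log2(n)
    return n

n = 2.0 ** 64
# Each extra log is a huge drop:
# nested_log(n, 1) = 64.0, nested_log(n, 2) = 6.0,
# nested_log(n, 3) ~ 2.585
```

Each additional log shrinks the value from n to roughly log of the previous level, so the chain O(log log ... log n) descends toward a constant without ever reaching one.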

smttsp