
Shouldn't all recursive algorithms have a worst-case space complexity of O(inf) due to potential stack overflow?

Take the Fibonacci algorithm, for example: what is the space complexity of a recursive Fibonacci implementation?

The answer is O(n), but what if the input is large enough that a stack overflow occurs? Isn't the worst case then O(inf)?
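For reference, here is the naive recursive Fibonacci the question is asking about (a minimal Python sketch). Although the call *tree* has exponentially many nodes, only one root-to-leaf chain is on the stack at any moment, which is why the space complexity is O(n):

```python
def fib(n):
    """Naive recursive Fibonacci: O(2^n) time, O(n) stack space."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# The deepest chain of pending calls is fib(n) -> fib(n-1) -> ... -> fib(1),
# so the stack holds at most n frames at once: O(n) space.
print(fib(10))  # -> 55
```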

Zanko
  • Worst case space complexity deals with theory, not `int` limits. – Eli Sadoff Oct 27 '16 at 19:14
  • If the input is large, then n is large and hence O(n) is large. For a given implementation, O(n) could be too large to support, but that is a separate issue from how the algorithm works. – Gordon Linoff Oct 27 '16 at 19:16
  • 1
    Big O is theoretical. In theoretical computer science, it's assumed that available resources are unlimited. – 4castle Oct 27 '16 at 19:18
  • In practice, the run time of any program is bounded by the lifetime of the universe; therefore the worst-case complexity is O(1). But this hardly tells you anything useful. – Joni Oct 27 '16 at 19:23
  • Note that some recursive algorithms can be subjected to [tail-call optimization](http://stackoverflow.com/q/310974/1115360). – Andrew Morton Oct 27 '16 at 19:31
  • Sure, but that sort of silly worst-case analysis could apply to any algorithm. If your implementation of Euclid's algorithm for calculating the gcd of two numbers crashes when fed extremely large inputs on a system of limited resources, you don't say the algorithm or its implementation is O(infinity). The algorithm is still linear; the implementation on a real system can't handle that specific instance. – dr jimbob Oct 27 '16 at 19:32
  • 2
    I'm voting to close this question as off-topic because this belongs on cs.stackexchange.com – dr jimbob Oct 27 '16 at 19:37
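To illustrate the tail-call comment above, Fibonacci can be rewritten in accumulator style so that the recursive call is the last action in the function (a sketch; note that Python itself does not perform tail-call optimization, but a TCO-capable language such as Scheme would reuse the same stack frame and run this in O(1) space):

```python
def fib_acc(n, a=0, b=1):
    # Accumulator style: the recursive call is in tail position,
    # so a language with TCO would not grow the stack at all.
    if n == 0:
        return a
    return fib_acc(n - 1, b, a + b)

print(fib_acc(10))  # -> 55
```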

1 Answer


Short answer: No.

Slightly longer answer: As everyone mentioned in the comments, Big-O notation assumes infinite resources, so issues of implementation (in particular hardware related ones) have no bearing. It's only a description of the algorithm and how it hypothetically would scale if you give it arbitrarily large inputs.

That said, if you WERE to include hardware constraints in the definition, the answer would still be no, since Big-O is concerned with processor cycles, or iterations, or tasks -- that sort of measurement of doing something. If you recurse so deeply that you run out of stack, you stop performing any algorithm-related operations whatsoever. The program just halts, and you aren't "doing anything" anymore. There's nothing left to measure, and certainly nothing infinite.
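This is easy to see in practice (a hypothetical Python sketch): when the stack limit is hit, the program doesn't loop forever accumulating cost; it stops immediately with an error.

```python
import sys

def down(n):
    # Unbounded recursion: each call adds one stack frame.
    return down(n + 1)

sys.setrecursionlimit(1000)  # keep the demo small
try:
    down(0)
except RecursionError:
    # Execution halts here; no further "work" is performed or measured.
    print("stack exhausted")
```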

And anyway, the point of Big-O is to be useful and help with understanding the theory of the algorithm. Saying "everything is Big-O of infinity", even if it were right, wouldn't accomplish any of that. It'd just be a loophole in the definition.

B. Eckles