
I understand that Big Omega defines the lower bound of a function (or best-case runtime).

Considering that almost every search algorithm could "luck out" and find the target element on the first iteration, would it be fair to say that its Big-Omega time complexity is O(1)?

I also understand that giving O(1) as the big-Omega bound may not be useful (other lower bounds may be tighter, i.e. closer to the evaluated function), but the question is: is it correct?

I've found multiple sources claiming that linear search is big-Omega of n, i.e. Ω(n), even though some cases could complete in a single step, which is different from the best-case scenario as I understand it.

pfernandom
  • It's fair, but it's a useless thing to say. The only functions that *aren't* Ω(1) are functions that *decrease* asymptotically. – chepner Feb 16 '23 at 21:17
  • O(1) is read "Big-oh of 1" or similar; it is a different thing altogether from big-omega notation. Every comparison-based sorting algorithm, for example, is Ω(n), because even in the best case (list is already sorted), you still have to look at each element in the list to determine that. – chepner Feb 16 '23 at 21:20
  • Linear search is Ω(1) in the best case (you find your element in the first position you look) and O(n) in the worst case (you find your element in the last position you look). – chepner Feb 16 '23 at 21:21
  • Notice that big-Ω indeed defines a lower bound, but this is not the same as best-case time complexity. You can describe best-case time complexity with big-O, Ω or Θ equally correctly. And you can describe the worst-case or average complexity using any of them too. – Berthur Feb 16 '23 at 21:58
  • @Berthur exactly what you just described is what my question is about: What is the difference between "lower-bound" and "best-case" in the context of algorithmic time complexity? – pfernandom Feb 17 '23 at 21:36
  • @pfernandom Glad I hit the target :) I attempted an answer to that below. Though I see that meanwhile, others also picked up this question and wrote some good insights. – Berthur Feb 17 '23 at 23:11

3 Answers


The lower bound (Ω) is not the fastest answer a given algorithm can give.

The lower bound of a given problem is equal to the worst case scenario of the best algorithm that solves the problem. When doing complexity analysis, you should never forget that "luck" is always in the hands of the input (the instance the algorithm is trying to solve).

When trying to find a lower bound, you imagine the "perfect algorithm" and try to "trap" it in a very hard case. Usually the algorithm is not defined concretely and is only described by its (hypothetical) performance. You would use arguments such as: "If the ideal algorithm is that fast, it cannot have this particular knowledge and will therefore fail on this particular instance, i.e. the ideal algorithm doesn't exist." Replace "that fast" with the lower bound you are trying to prove.

For example, the lower bound for the min-search problem in an unsorted array is Ω(n). The proof is quite simple and, as is often the case, proceeds by contradiction: an algorithm A running in o(n) cannot inspect every item of the input array, and if an item it skipped happened to be the minimum, A would return a wrong answer. The contradiction proves that the problem is in Ω(n).
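As a concrete illustration of that contradiction (my sketch, not code from this answer), here is a hypothetical sublinear "algorithm" that skips half of the input, and an adversarial instance that defeats it:

```python
def lazy_min(arr):
    # Hypothetical o(n) "algorithm": it only ever reads the first half.
    return min(arr[: len(arr) // 2])

def true_min(arr):
    # The honest algorithm: inspects all n elements, hence Θ(n).
    best = arr[0]
    for x in arr[1:]:
        if x < best:
            best = x
    return best

# Adversarial instance: the true minimum hides among the items lazy_min skips.
adversarial = [5, 7, 9, 8, 1, 6]
print(lazy_min(adversarial))  # 5 (wrong)
print(true_min(adversarial))  # 1 (correct, after n - 1 comparisons)
```

Whatever items a sublinear algorithm leaves unread, an adversary can place the minimum there; that is exactly the contradiction the proof relies on.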

Maybe you can have a look at that answer I gave on a similar question.

cglacet

The notations O, o, Θ, Ω, and ω are used in characterizing mathematical functions; for example, f(n) = n³ log n is in O(n⁴) and in Ω(n³).
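Both claims can be checked directly from the definitions; as a routine verification (my addition, not part of the original answer), since 1 ≤ log n ≤ n for all n ≥ 2:

```latex
n^3 \le n^3 \log n \le n^3 \cdot n = n^4 \qquad \text{for all } n \ge 2
```

The first inequality witnesses f(n) ∈ Ω(n³) (with constant c = 1), and the second witnesses f(n) ∈ O(n⁴).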

So, the question is what mathematical functions we apply them to.

The mathematical functions that we tend to be interested in are things like "the worst-case time complexity of such-and-such algorithm, as a function of the size of its input", or "the average-case space complexity of such-and-such procedure, as a function of the largest element in its input". (Note: when we just say "the complexity of such-and-such algorithm", that's usually shorthand for its worst-case time complexity, as a function of some characteristic of its input that's hopefully obvious in context. Either way, it's still a mathematical function.)

We can use any of these notations in characterizing those functions. In particular, it's fine to use Ω in characterizing the worst case or average case.

We can also use any of these notations in characterizing things like "the best-case […]"; that's unusual, but there are times when it may be relevant. Notably, though, we're not limited to Ω for that: just as we can use Ω in characterizing the worst case, we can also use O in characterizing the best case. It's all about what characterizations we're interested in.
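To make that concrete with linear search (my example, not part of the original answer), write T_best(n) and T_worst(n) for its best- and worst-case comparison counts. The tightest characterizations are

```latex
T_{\mathrm{best}}(n) = \Theta(1), \qquad T_{\mathrm{worst}}(n) = \Theta(n)
```

but it is equally valid, just less precise, to write T_best(n) = O(n) (an upper bound on the best case) or T_worst(n) = Ω(1) (a lower bound on the worst case).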

ruakh

You are confusing two different topics: Lower/upper bound, and worst-case/best-case time complexity.

The short answer to your question is: Yes, all search algorithms have a lower bound of Ω(1). Linear search (in the worst case, and on average) also has a lower bound of Ω(n), which is a stronger and more useful claim. The analogy is that 1 < π but also 3 < π, the latter being more useful. So in this sense, you are right.

However, your confusion seems to be between the notations for complexity classes (big-O, big-Ω, big-Θ, etc.) and the concepts of best-case, worst-case, and average-case complexity. The point is that the best-case and worst-case time complexities of an algorithm are completely different functions, and you can use any of the notations above to describe any of them. (NB: Some claim that big-Ω automatically and exclusively describes best-case time complexity and that big-O describes the worst case, but this is a common misconception. They just describe complexity classes, and you can use them with any mathematical function.)

It is correct to say that the average time complexity of linear search is Ω(n), because we are just talking about the function that describes its average time complexity. Its best-case complexity is a different function, which happens not to be Ω(n) because, as you say, it can complete in constant time.
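To see that the best and worst cases really are two different functions of n, here is a small step-counting sketch (my illustration, not from the original answer):

```python
def linear_search_steps(arr, target):
    """Linear search that also reports how many comparisons it made."""
    steps = 0
    for i, x in enumerate(arr):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

n = 1000
arr = list(range(n))
print(linear_search_steps(arr, 0))      # (0, 1):      best case, T_best(n) = 1
print(linear_search_steps(arr, n - 1))  # (999, 1000): worst case, T_worst(n) = n
```

T_best(n) = 1 is Θ(1) (and therefore also O(1) and Ω(1)); T_worst(n) = n is Θ(n) (and also O(n) and Ω(n)). Each notation simply brackets whichever of these functions you apply it to.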

Berthur