
I was learning about algorithms and time complexity, and this question just sprung into my mind.

Why do we only analyze an algorithm's time complexity?

My question is, shouldn't there be another metric for analyzing an algorithm? Say I have two algos A and B.

A takes 5s for 100 elements, B takes 1000s for 100 elements. But both have O(n) time.

So this means that the running times of A and B both grow no faster than cn for two separate constants c=c1 and c=c2. But in my very limited experience with algorithms, we've always ignored this constant term and just focused on the growth. But isn't it very important while choosing between my given example of A and B? Here c1 << c2, so Algo A is much better than Algo B.
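To make the scenario concrete, here is a hypothetical sketch of two algorithms that are both O(n) yet differ hugely in their constant factors (the names `algo_a` and `algo_b` are made up for illustration):

```python
import timeit

def algo_a(data):
    # one cheap pass over the input: c1 is small
    total = 0
    for x in data:
        total += x
    return total

def algo_b(data):
    # still a single pass over the input, so still O(n),
    # but each step does ~200x more work: c2 is large
    total = 0
    for x in data:
        for _ in range(200):
            total += x
        total -= 199 * x
    return total

data = list(range(100))
assert algo_a(data) == algo_b(data)  # identical results
t_a = timeit.timeit(lambda: algo_a(data), number=100)
t_b = timeit.timeit(lambda: algo_b(data), number=100)
# Both scale linearly with len(data), but t_b is far larger than t_a.
```

Both functions touch each element a fixed number of times, so asymptotic analysis calls them both linear even though their wall-clock times differ by orders of magnitude.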

Or am I overthinking at an early stage and proper analysis will come later on? What is it called?

OR is my whole concept of time complexity wrong and in my given example both can't have O(n) time?

user1265125
    possible duplicate of [Why are constants ignored in asymptotic analysis?](http://stackoverflow.com/questions/4150495/why-are-constants-ignored-in-asymptotic-analysis) – mbeckish Mar 22 '13 at 17:51
    Nitpick: We don't just analyze the complexity in time, but also in space, comparisons made, swaps made, queries made to a relevant oracle, or whatever is of interest. But yeah, constants are usually ignored. –  Mar 22 '13 at 17:54

5 Answers


We worry about the order of growth because it provides a useful abstraction to the behaviour of the algorithm as the input size goes to infinity.

The constants "hidden" by the O notation are important, but they're also difficult to calculate because they depend on factors such as:

  • the particular programming language that is being used to implement the algorithm
  • the specific compiler that is being used
  • the underlying CPU architecture

We can try to estimate these, but in general it's a lost cause unless we make some simplifying assumptions and work on some well defined model of computation, like the RAM model.

But then, we're back into the world of abstractions, precisely where we started!

abeln
    +1: The constants are not just hidden by BigOh, but even Omega and Theta as well, which are also used when discussing time complexities. – Knoothe Mar 22 '13 at 18:05

We measure lots of other types of complexity.

But I guess you're talking more about a "don't the constants matter?" approach. Yes, they do. The reason it's useful to ignore the constants is that they keep changing. Different machines perform different operations at different speeds. You have to walk the line between useful in general and useful on your specific machine.

Craig Gidney

It's not always time. There's also space.

As for the asymptotic time cost/complexity, which O() gives you: if you have a lot of data, an algorithm whose running time is n^2 is going to be worse than one whose running time is 100*n once n > 100. For smaller n you should prefer the n^2 algorithm.

And, obviously, a running time of 100*n is always worse than a running time of 10*n.

The details of your problem should contribute to your decision between several possible solutions (choices of algorithms) to it.
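The crossover point mentioned above can be checked directly. This small sketch compares the step counts of a quadratic algorithm (n^2 steps) and a linear one with a large constant (100*n steps):

```python
def steps_quadratic(n):
    # a hypothetical algorithm taking n**2 steps
    return n * n

def steps_linear(n):
    # a hypothetical algorithm taking 100*n steps
    return 100 * n

# Below the crossover, the asymptotically "worse" algorithm wins:
assert steps_quadratic(50) < steps_linear(50)     # 2500 < 5000
# At n = 100 they meet, and beyond it the O(n) algorithm wins:
assert steps_quadratic(100) == steps_linear(100)  # both 10000
assert steps_quadratic(1000) > steps_linear(1000)
```

This is exactly why the expected size of your input should feed into the choice between candidate algorithms.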

Alexey Frunze

A takes 5s for 100 elements, B takes 1000s for 100 elements. But both have O(n) time.

Why is that?
O(N) is an asymptotic measurement of the number of steps required to execute a program in relation to the program's input.
This means that for really large values of N, the algorithm's running time grows linearly.
We don't compare X and Y seconds; we analyze how the algorithm behaves as the input grows larger and larger.
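One way to see what "behaves as the input grows" means: for a linear-time algorithm, doubling n roughly doubles the step count, and the constant factor cancels out of the ratio entirely (the constant `c` below is arbitrary):

```python
def step_count(n, c=1000):
    # a linear-time algorithm with some unknown constant factor c
    return c * n

# Doubling the input doubles the work, whatever c happens to be:
ratio = step_count(2_000_000) / step_count(1_000_000)
assert ratio == 2.0
```

This scaling ratio is the kind of statement O(n) makes, and it holds for both the 5-second and the 1000-second algorithm from the question.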

Cratylus

O(n) gives you an idea of how much slower the same algorithm will be for a larger n; it is not meant for comparing different algorithms against each other. On the other hand, there is also space complexity: how memory usage grows as a function of the input size n.
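A hypothetical sketch of the time/space trade-off mentioned above: two ways to check a list for duplicates, where spending O(n) extra memory buys a faster running time (the function names are made up for illustration):

```python
def has_duplicates_constant_space(items):
    # O(1) extra space, O(n^2) time: compare every pair
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear_space(items):
    # O(n) extra space, O(n) expected time: remember what we've seen
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

assert has_duplicates_constant_space([1, 2, 3, 2]) is True
assert has_duplicates_linear_space([1, 2, 3]) is False
```

Both are correct; which one is "better" depends on whether memory or time is the scarcer resource for your input sizes.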