-1

Given a computer and the big O of an algorithm's running time, I want to estimate the approximate actual time the algorithm will take on that computer.

For example, let's say I have an algorithm of complexity O(n) and a computer with a single 3.00 GHz, one-core, 32-bit processor and 4 GB of RAM. How can I estimate the actual number of seconds this algorithm will take?

Thanks

Silkking
  • You'd obviously need to know the constant terms ignored in the big O, how the algorithm compiles down to machine cycles, where any memory or I/O bottlenecks would be, etc., so the O itself isn't enough. Your best bet is to implement and measure. – Rup Jul 30 '18 at 16:38
  • This should answer your question: [What is a plain English explanation of "Big O" notation?](https://stackoverflow.com/q/487258) (you can't, basically) – Bernhard Barker Jul 30 '18 at 19:21

1 Answer

3

There is no good answer to this question, simply because big O notation was never meant to answer that sort of question.

What big O notation does tell us is the following:

> Big O notation characterizes functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation.

Note that this means functions that take very different values can be assigned the same big O classification.

In other words, big O notation doesn't tell you much about how fast an algorithm runs on a particular input; rather, it describes how the running time grows as the input size approaches infinity.
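
To make that concrete, here is a minimal sketch in Python (the question names no language, and the functions `sum_simple` and `sum_heavy` are made up purely for illustration): both are O(n), but their actual running times on the same machine differ by a large constant factor that the notation deliberately ignores.

```python
import time

def sum_simple(n):
    # One addition per element -> O(n) with a small constant factor.
    total = 0
    for i in range(n):
        total += i
    return total

def sum_heavy(n):
    # Still one pass over n elements -> O(n), but each step does ~100x more work.
    total = 0
    for i in range(n):
        for _ in range(100):  # inner loop is constant (does not grow with n)
            total += 1
    return total

def measure(fn, n):
    # Wall-clock time of a single call.
    start = time.perf_counter()
    fn(n)
    return time.perf_counter() - start

if __name__ == "__main__":
    n = 100_000
    print(f"sum_simple({n}): {measure(sum_simple, n):.4f} s")
    print(f"sum_heavy({n}):  {measure(sum_heavy, n):.4f} s")
```

So if you need an estimate in seconds for a specific machine, the only reliable route is the one suggested in the comments: implement the algorithm, measure it on a few input sizes, and extrapolate.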

Rushabh Mehta