Given a computer and the big-O running time of an algorithm, I want to estimate the actual approximate time the algorithm will take on that computer.
For example, let's say I have an algorithm of complexity O(n) and a computer with a single-core 3.00 GHz, 32-bit processor and 4 GB of RAM. How can I estimate the actual number of seconds this algorithm will take?
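To make the question concrete, here is the kind of rough calculation I have in mind, assuming the algorithm spends some constant number of CPU cycles per element (the `cycles_per_element` value is a made-up placeholder; this constant factor is exactly what big-O notation hides, and I don't know how to obtain it):

```python
# Back-of-the-envelope estimate for an O(n) algorithm.
# ASSUMPTION: a fixed number of cycles per element, chosen arbitrarily here.

clock_hz = 3.0e9          # 3.00 GHz processor
cycles_per_element = 10   # hypothetical constant factor -- unknown in practice
n = 1_000_000_000         # input size

estimated_seconds = n * cycles_per_element / clock_hz
print(estimated_seconds)  # roughly 3.33 under these assumptions
```

Is something along these lines even meaningful, and if so, how would I determine the constant factor for a real algorithm and machine?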