What do people mean when they say things like:
Big O estimates scalability.
The runtime grows "on the order of the square of the size of the input", given that the algorithm in the worst case runs in $O(n^2)$.
Does it mean that, for large $n$, the runtime quadruples when the input size doubles (assuming the algorithm runs in $O(n^2)$)?
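The intuition behind a YES answer would presumably be this: if the worst-case runtime were exactly quadratic, say $f(n) = cn^2$ for some constant $c > 0$, then doubling the input size would indeed quadruple the runtime:

$$\frac{f(2n)}{f(n)} = \frac{c\,(2n)^2}{c\,n^2} = 4.$$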
If you said YES, then suppose that the number of steps our algorithm takes in the worst case is expressed by the function:
It follows that:
Moreover, we can see that:
But we can't quadruple the runtime for the doubled input size $n$ (even for large values of $n$) in this case, because it would simply be wrong. To see this, check the following plot:
And here is the plot of the ratio $f(2n)/f(n)$:
It doesn't look like this ratio is tending towards 4.
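For concreteness, a hypothetical function (not necessarily the $f$ above, but with the same qualitative behaviour: a worst-case step count in $O(n^2)$ whose doubling ratio does not tend to $4$) would be:

$$f(n) = \begin{cases} n^2, & n \text{ even},\\ 2n^2, & n \text{ odd}. \end{cases}$$

Here $f(n) = O(n^2)$, and since $2n$ is always even, $f(2n) = 4n^2$, so

$$\frac{f(2n)}{f(n)} = \begin{cases} 4, & n \text{ even},\\ 2, & n \text{ odd}, \end{cases}$$

i.e. the ratio oscillates between $2$ and $4$ forever instead of settling at $4$.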
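And a minimal sketch to reproduce such a ratio plot, assuming the hypothetical $f$ above and matplotlib:

```python
import numpy as np
import matplotlib.pyplot as plt

def f(n):
    # Hypothetical worst-case step count: O(n^2), but the
    # doubling ratio f(2n)/f(n) oscillates instead of tending to 4.
    return n**2 if n % 2 == 0 else 2 * n**2

ns = np.arange(1, 200)
ratios = [f(2 * n) / f(n) for n in ns]

plt.plot(ns, ratios, ".", markersize=3)
plt.axhline(4, linestyle="--", color="gray")  # the "expected" limit 4
plt.xlabel("n")
plt.ylabel("f(2n) / f(n)")
plt.title("Doubling ratio of a hypothetical O(n^2) step count")
plt.show()
```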
So, what is scalability, then? What does Big O estimate, if not ratios like $f(2n)/f(n)$ for large enough $n$? It doesn't seem to me that $f$ scales like $n^2$.