
It was during a Data Structures and Algorithms lesson that I encountered this doubt. I have looked up several resources, both online and offline, but still have doubts. I was asked to figure out the running time of an algorithm and I did that correctly: O(n^3). But what puzzled me was this question: how much slower does the algorithm run if n goes from, say, 90 to 900,000? My doubts were:

  1. The algorithm has a running time of O(n^3), so obviously it will take more time for a larger input. But how do I compare an algorithm's performance for different inputs based on just the worst-case time complexity?
  2. Can we just plug in the values of 'n' and divide the big-O expressions to get a ratio?

Please correct me wherever I am mistaken! Thanks!

Utsav T
  • I think the answers (and probably the question) confuse Big-Oh with Big-Theta (or ignore the difference); while programmers usually mean Big-Theta when they talk about worst-case running times for algorithms but use Big-Oh, this is a bit of a subtlety that you may want to look out for. Also, trying to get specific values out of Landau notation is a bit of a fool's errand anyway. It hides lower-order terms, so any specific values you derive from it will have errors attached to them, sometimes significant. For a ballpark figure, you can do what you suggested, though. – G. Bach Jul 21 '14 at 00:56
  • @G.Bach - Thanks, I have revised the answer, included your name, and hinted at your helpful comment. Please comment on the answer below mine. Also, point out any serious mistakes, if found! – Am_I_Helpful Jul 21 '14 at 04:24
  • Your understanding is wrong, because you ignored the initial sentence for any of these complexity claims: "There is an n_0 so that for all n > n_0 the following holds". If your n_0 is large, and in reality that can easily be hundreds of thousands of elements, your approach fails. – Ulrich Eckhardt Jul 21 '14 at 05:58

2 Answers


Obviously, your thinking is correct!

Since the algorithm's worst-case running time grows with the input size n as n^3, increasing the size slows it down in proportion to n^3.

Your assumption is correct. You just plug in the two values, divide the result for n = 900,000 by the result for n = 90, and that quotient is the slowdown factor.

Here, slowdown factor = (900,000)^3 / (90)^3 = (10,000)^3 = 10^12.

Hence, the slowdown factor is 10^12: your program will run roughly 10^12 times slower. Such a drastic change! Or, in other words, its efficiency drops to 10^(-12) of what it was.
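
For concreteness, here is a minimal sketch of that division in Python (a hypothetical illustration; it assumes the running time is dominated by the cubic term and, as G. Bach's comment warns, ignores the constants and lower-order terms hidden by the O-notation):

```python
# Ballpark estimate: if T(n) is roughly c * n^3, then
# T(900,000) / T(90) is roughly (900,000 / 90)^3.
# Constants and lower-order terms hidden by O(n^3) are ignored here.
n_small, n_large = 90, 900_000

slowdown = (n_large // n_small) ** 3   # 900,000 / 90 = 10,000, then cubed
print(slowdown)                        # 1000000000000, i.e. 10^12
```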

EDIT based on the comments above:

But how do I compare an algorithm's performance for different inputs based on just the worst-case time complexity?

As hinted by G. Bach, one of the commenters on this post, the basic idea underlying your question is itself a bit contradictory! You should be talking about Big-Theta notation, instead of Big-O, to think about a general solution. Big-O is an upper bound, whereas Big-Theta is a tight bound. When people only worry about the worst that can happen, Big-O is sufficient; i.e., it says that "it can't get much worse than this". The tighter the bound the better, of course, but a tight bound isn't always easy to compute.

So, under a worst-case analysis, your question would be answered the way both of us have answered it. One narrow reason people use O instead of Ω is to drop disclaimers about worst or average cases. But for a tighter analysis you will have to look at both Big-O and Big-Omega to frame the question properly. The same question can be posed and solved for other sizes of n as well; I leave that for you to figure out, but if you have any doubts, please feel free to comment.

Also, your running time is not directly determined by the worst-case analysis alone, but it is related to it and the question can be reframed this way.

P.S.: I have taken some ideas/statements from Big-oh vs big-theta, so thanks to those authors too!

I hope it helps. Feel free to comment if I skimmed over any info!

Am_I_Helpful
  • Not necessarily. Neither 90 nor 900,000 equals infinity. – Oliver Charlesworth Jul 20 '14 at 21:05
  • @OliCharlesworth - Why, sir? He has already mentioned that it is dependent on n and that the algorithm is affected by n^3. – Am_I_Helpful Jul 20 '14 at 21:08
  • I think the problem is that the answers to the two questions go in opposite directions: (1) You're correct, `O(n^3)` doesn't tell you about the actual running time. (2) You're correct, treat `n^3` as the actual running time. I think OP is looking for how to reconcile those two statements. – Teepeemm Jul 21 '14 at 01:18

Your understanding is correct.

  1. If the only performance measure you have is worst-case time complexity, that's all you can compare. If you had more information you could make a better estimate.

  2. Yes, just substitute the values of n and divide 900,000^3 by 90^3 to get the ratio.
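
As a rough sanity check of that ratio, one could time a deliberately cubic routine at two moderate sizes and compare the measured ratio with (n2/n1)^3. The snippet below is only an illustrative sketch (hypothetical Python with made-up sizes; real measurements are noisy and include the constant factors that Big-O hides):

```python
import time

def cubic_work(n):
    """A deliberately O(n^3) triple loop, used only as a stand-in algorithm."""
    total = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                total += 1
    return total

def measure(n):
    start = time.perf_counter()
    cubic_work(n)
    return time.perf_counter() - start

n1, n2 = 50, 100                            # small, illustrative sizes
t1, t2 = measure(n1), measure(n2)

print("measured ratio :", t2 / t1)          # should come out roughly (n2 / n1) ** 3
print("predicted ratio:", (n2 / n1) ** 3)   # 8.0
```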

Bill the Lizard
  • Thanks for answering. But even if I showed you the algorithm - say, for example, Insertion Sort in C - how would it help make a better estimate? By taking into consideration all the steps and individual time complexities - like O(n^2) would actually have been O(n^2 + n + 3)..? – Utsav T Jul 20 '14 at 21:20
  • @Noob For specific values of n, I guess that would work. As n increases, eventually the first term will overpower all of the others, though. I was thinking in terms of best-case and average-case time complexity. If you have those, you can give an estimated range of expected performance. – Bill the Lizard Jul 20 '14 at 21:26
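
To illustrate Bill's point about the first term overpowering the others, here is a small sketch (hypothetical Python, reusing the n^2 + n + 3 expression from the comment above) showing how the exact count converges to the leading n^2 term as n grows:

```python
# Compare the full cost expression from the comment, n^2 + n + 3,
# with just its leading term n^2: the relative gap shrinks as n grows.
for n in (10, 100, 1_000, 10_000):
    exact = n**2 + n + 3
    leading = n**2
    print(n, exact / leading)   # ratios: 1.13, 1.0103, 1.001003, 1.00010003
```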