
EDIT: looking for a good intro to time complexity.

I've been doing a lot of Project Euler problems and I have some questions about performance. I could write a whole bunch of scripts to test different theories about which operations take longer than others, but I'd rather read it from a pro. Being an amateur enthusiast, I probably don't even know which questions to ask or which theories to test. So I'm looking for a resource that sheds light on what kinds of things take longer or use more CPU cycles in Python. Is there a good place to go for "rule of thumb" efficiency information like that?

JB0x2D1

1 Answer


Unfortunately, there are no rules of thumb like that. The actual performance of actual programs running on actual computers is way too complicated to capture in simple rules.

For predicting whether one piece of code will be feasible to run at all (which is what you need for Project Euler), complexity theory ("big-O notation") is the tool of choice. Rigorous analysis, or just intuition, can tell you that (and why) some procedure will take at least 2^n steps for inputs of size n. Knowing the order of magnitude of n, you can then easily conclude that the procedure will not finish quickly enough.
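
As a rough back-of-envelope illustration of that reasoning (the 10**8 operations-per-second figure below is an assumption made for the sake of the estimate, not a measurement of any real machine):

    # Estimate how long 2**n steps would take at an assumed rate of
    # 10**8 simple operations per second (an assumption, not a measurement).
    OPS_PER_SECOND = 10 ** 8

    for n in (20, 30, 40, 50):
        seconds = 2 ** n / OPS_PER_SECOND
        print("n = %2d: 2**n steps take roughly %12.0f seconds (%.1f days)"
              % (n, seconds, seconds / 86400))

Already at n = 50 the procedure needs months of CPU time, so no amount of micro-optimization will save it; reaching that conclusion required no profiling at all.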

But this mathematical tool says little to nothing about actual running time: an algorithm taking O(n) time might be much faster than an algorithm taking O(log n) time, for all n that are ever used in practice, simply because its constant factors are smaller. And while there are some rules for estimating complexity ("nested loop = quadratic" is a common one, and it breaks down for plenty of useful programs), you need intuition anyway, and once you have that, those rules of thumb become obsolete.
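
A minimal sketch of that point, using only the standard-library bisect and timeit modules: on a small sorted list, an O(n) membership scan and an O(log n) binary search are separated mostly by constant factors, so which one wins depends on the machine, the Python version, and the list size rather than on the asymptotics. The list size and repetition count below are arbitrary choices for illustration.

    import bisect
    import timeit

    data = list(range(0, 64, 2))   # a small sorted list (32 elements)
    target = 62                    # worst case for the linear scan

    def linear_contains(seq, x):
        # O(n): scans every element, but the scan itself runs in C
        return x in seq

    def binary_contains(seq, x):
        # O(log n): far fewer comparisons, but more per-call overhead
        i = bisect.bisect_left(seq, x)
        return i < len(seq) and seq[i] == x

    for name, fn in (("linear O(n)    ", linear_contains),
                     ("binary O(log n)", binary_contains)):
        t = timeit.timeit(lambda: fn(data, target), number=200000)
        print("%s: %.3f seconds" % (name, t))

On one machine the linear version may win, on another the binary one; the point is that big-O alone cannot tell you which.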

  • I'd add that, as far as finding out "the actual performance of actual programs running on actual computers" goes, this question on profiling is a good place to look: http://stackoverflow.com/questions/582336/how-can-you-profile-a-python-script – Tim Wilder Nov 30 '13 at 20:17
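
For reference, here is a minimal sketch of one approach discussed in that question, the standard-library cProfile module. The quadratic slow_sum function is just a hypothetical workload invented for this illustration:

    import cProfile

    def slow_sum(n):
        # Deliberately quadratic workload: a nested loop over n items
        total = 0
        for i in range(n):
            for j in range(i):
                total += j
        return total

    # Prints a per-function breakdown of call counts and time spent
    cProfile.run("slow_sum(2000)")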