
The docs only say that the Python interpreter performs "basic optimizations", without going into any detail. Obviously this is implementation-dependent, but is there any way to get a feel for what kinds of things could be optimized, and how much run-time saving they could generate?

Is there any downside to using -O?

The only thing I know is that -O disables assert, but presumably one shouldn't use assert for things that could still go wrong in production.
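To make that last point concrete, here is a small sketch (the function and its names are made up) of why `assert` is the wrong tool for production checks: the whole statement vanishes under -O, while an explicit raise does not:

```python
def withdraw(balance, amount):
    # Fragile: under -O this line is compiled away entirely,
    # so the overdraft check silently disappears.
    assert amount <= balance, "insufficient funds"

    # Robust: an explicit check survives every optimization level.
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

print(withdraw(100, 30))  # 70
```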

max
  • possible duplicate of [What is the use of Python's basic optimizations mode? (`python -O`)](http://stackoverflow.com/questions/1693088/what-is-the-use-of-pythons-basic-optimizations-mode-python-o) – tzot Feb 17 '11 at 08:52

1 Answer


In Python 2.7, -O has the following effect:

  • `assert` statements are removed
  • `__debug__` is set to `False`
  • byte code files are named `.pyo` instead of `.pyc`

In addition, -OO has the following effect:

  • docstrings are removed

To verify the effect for a different release of CPython, grep the source code for Py_OptimizeFlag.

Link to official documentation: https://docs.python.org/2.7/tutorial/modules.html#compiled-python-files
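The `assert` and `__debug__` changes are easy to observe by launching the same one-liner with and without the flag. A minimal sketch (uses `subprocess.run` with `capture_output`, which needs Python 3.7+):

```python
import subprocess
import sys

prog = "print(__debug__); assert False, 'boom'; print('after assert')"

# Normal run: __debug__ is True and the assert raises AssertionError.
normal = subprocess.run([sys.executable, "-c", prog],
                        capture_output=True, text=True)

# With -O: __debug__ is False and the assert is stripped out entirely,
# so execution continues past it.
optimized = subprocess.run([sys.executable, "-O", "-c", prog],
                           capture_output=True, text=True)

print(normal.stdout, normal.returncode)        # True / non-zero exit
print(optimized.stdout, optimized.returncode)  # False, after assert / 0
```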

Martin v. Löwis
  • Is there any downside to the -O flag apart from missing out on the built-in debugging information? – max Jan 24 '11 at 02:17
  • I've seen many Python modules that assume docstrings are available and would break if that optimization level is used. For instance, at the company where I work, raw SQL is placed in docstrings and executed by way of function decorators (not even kidding). Somewhat less frequently, `assert` is used to perform logic, rather than merely to declare the invariant expectations at a point in the code, and any code like that would also break. – SingleNegationElimination Jan 24 '11 at 03:12
  • @max: if you go through the complete list of semantic changes above, do you consider any of them a "downside"? If not, there are no downsides. I personally consider it a disadvantage that the name of the byte code files changes - it contributes to disk clutter. Notice that "missing built-in debugging information" is *not* in the list; pdb continues to work fine (this was not the case in earlier Python releases, where -O dropped support for single-stepping in pdb). – Martin v. Löwis Jan 24 '11 at 07:43
  • How much optimization can it really offer? Obviously it depends on the code, interpreter, and environment, but perhaps some general comments apply? – max Jan 27 '11 at 00:47
  • @max: in general, I wouldn't expect any significant change in speed. Optimized byte code was originally designed to drop the inefficient SETLINENO byte code instruction, which was needed for single-stepping. However, single-stepping has since been reimplemented using a more efficient approach, so -O has lost its point. – Martin v. Löwis Jan 27 '11 at 06:01
  • At least in current CPython versions, `__debug__` doesn't just get changed to `False`; any code under `if __debug__` is entirely stripped out. – kindall Nov 18 '13 at 16:57
  • Is the behaviour the same for Python 3? – Serge Feb 13 '14 at 10:25
  • Same for Python 3 as far as I know, though I believe the new object model allows for even more inlining of `__debug__`-based control flow. Don't quote me on it ;) Either way, it should be pretty similar. Some things CPython 2 can't (or doesn't) optimize: short-circuit boolean operations, `if not __debug__`, inline `if` statements. – DylanYoung Dec 14 '17 at 21:39