
I'm trying to make a deprecation system that allows code to run transparently for regular users, but flag deprecated objects in developer mode.

One issue that I'm having is that I can import a deprecated object into another module even if I'm in developer mode. This means that I'm missing places where the deprecated object is used.

For example in module1.py:

class MyObject(object):
    pass
MyObject = MyObject if not dev_mode() else DeprecatedObject

Then in module2.py:

from module1 import MyObject

I already have DeprecatedObject set up so that any interaction with it raises a DeprecationWarning - is there any way I can make it error on import? i.e. even importing module2.py would raise an exception.

I'm imagining something like:

import warnings

class DeprecatedObject(object):
    ...
    def __onimport__(self):
        warnings.warn("deprecated", DeprecationWarning)
wim
ninhenzo64
  • [An `__init__.py` file](https://docs.python.org/3/tutorial/modules.html#packages) might let you do what you want? – bsinky Jan 12 '18 at 18:09
    Perhaps my own question; is it right to raise an _exception_ on importing deprecated methods/objects/modules? Why do they still exist in that case? – roganjosh Jan 12 '18 at 18:11
  • You can probably achieve this using class decorators: https://stackoverflow.com/questions/739654/how-to-make-a-chain-of-function-decorators/1594484#1594484. Just have the decorator return the original class or the DeprecatedObject class depending on dev_mode, or raise DeprecationWarning directly. – AntiMatterDynamite Jan 12 '18 at 18:12
  • Is there any reason you don't just use an `if` at the module level? `if dev_mode()`, define a whole bunch of classes and methods with the deprecated functionality. `else:` define pristine versions of the same. – Mad Physicist Jan 12 '18 at 21:52

3 Answers


The module-level __getattr__ feature allows, among other things, module-level names to go through a proper deprecation process at import time. This feature is coming in Python 3.7; see PEP 562 for details. Since you've tagged the question with Python 2.7 it can't help you directly, but I mention it for the benefit of future readers.
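
For those future readers, a minimal sketch of what that can look like. dev_mode() comes from the question; the stub below and the renamed _MyObject are illustrative stand-ins, not a prescribed implementation:

# module1.py (Python 3.7+): module-level __getattr__ per PEP 562.
import warnings

def dev_mode():
    # Stand-in for the question's dev_mode(); the real check lives elsewhere.
    return False

class _MyObject(object):
    # The real implementation, renamed so that access to the public name
    # has to go through __getattr__ below.
    pass

def __getattr__(name):
    # Called for any lookup that isn't found normally, which includes
    # "from module1 import MyObject" in other modules.
    if name == "MyObject":
        if dev_mode():
            raise DeprecationWarning("MyObject is deprecated")
        warnings.warn("MyObject is deprecated", DeprecationWarning, stacklevel=2)
        return _MyObject
    raise AttributeError("module {!r} has no attribute {!r}".format(__name__, name))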

On Python 2.7 you have two inferior options:

  • Trigger the deprecation warning in the object's __init__.
  • Use Guido's hack to replace the module with a patched version of itself after import. Wrapping a proxy object around the module lets you control name resolution (a rough sketch follows below).
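
A rough sketch of that second option on 2.7, again with a hypothetical renamed _MyObject; you could raise instead of warn when dev_mode() is on to get the hard failure the question asks for:

# module1.py (Python 2.7): swap this module for a proxy in sys.modules, so
# that "from module1 import MyObject" is routed through __getattr__.
import sys
import types
import warnings

class _MyObject(object):
    pass

class _DeprecatingModule(types.ModuleType):
    def __init__(self, wrapped):
        types.ModuleType.__init__(self, wrapped.__name__)
        self._wrapped = wrapped  # keep the real module alive

    def __getattr__(self, name):
        # Only reached for names not set directly on the proxy.
        if name == "MyObject":
            warnings.warn("MyObject is deprecated", DeprecationWarning, stacklevel=2)
            return _MyObject
        return getattr(self._wrapped, name)

sys.modules[__name__] = _DeprecatingModule(sys.modules[__name__])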
wim

First off, I recommend looking into the built-in warnings module. It has tools made specifically for this type of thing. Having a non-fatal warning in place makes more sense than raising an exception.

Now, for your case, one possible course of action would be to "replace" the deprecated class with a function. This means renaming the class to something else, and having a function with the original name which checks whether or not developer mode is enabled and acts accordingly. The result would be something like:

class MyDeprecatedClass(object):
    pass

def MyClass(*args, **kwargs):
    if dev_mode():
        raise DeprecationWarning("MyClass is deprecated")
    else:
        return MyDeprecatedClass(*args, **kwargs)

Or, with warnings:

from warnings import warn

def MyClass(*args, **kwargs):
    if dev_mode():
        warn("MyClass is deprecated; don't use it!", DeprecationWarning)
    return MyDeprecatedClass(*args, **kwargs)

Both versions check whether developer mode is enabled and only complain if it is: the first raises, the second merely warns and still returns an instance. Either way, the arguments are passed straight through to the constructor of the renamed class, so all old code that relies on it keeps working.
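
A quick usage sketch of the warning variant (assuming the factory lives in module1 as in the question, and that dev_mode() returns True); note that nothing is emitted at import time, only when the factory is actually called:

# module2.py -- usage sketch of the warning variant above.
from module1 import MyClass   # the import itself is silent

import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    obj = MyClass()                    # in dev mode this emits the warning
    print(type(obj).__name__)          # MyDeprecatedClass either way
    print(caught[0].message if caught else "no warning (dev mode off)")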

stelioslogothetis

Your initial approach is almost exactly what I would advise, except that it allows both types of object to exist simultaneously. I would start with a full-blown if statement in your module that only allows one of the objects to be defined at a time. Something more like:

if dev_mode():
    class MyObject(object):
        # Define deprecated version here
        ...
else:
    class MyObject(object):
        # Define production version here
        ...

If the difference between the deprecated and non-deprecated versions is something simple that could be handled by a function or class decorator (like raising a warning), you could simplify the code above to something like:

import warnings

if dev_mode():
    def function_decorator(func, cls=None):
        # The second argument is for calling this manually from the class decorator
        name = func.__name__ if cls is None else cls.__name__ + '.' + func.__name__
        warnings.warn("Importing deprecated function: {}".format(name))
        return func

    def class_decorator(cls):
        warnings.warn("Importing deprecated class: {}".format(cls.__name__))
        # Make additional modifications here (like applying function_decorator to the class's methods)
        return cls
else:
    def function_decorator(func, cls=None):
        return func
    def class_decorator(cls):
        return cls

@class_decorator
class MyClass(object):
    pass

Using a module-level if to avoid multiple versions of the class floating around is the basic tool here. You can add any number of layers of complexity on top of it. One technique I have seen for a similar purpose (where the particular version of a class depends on some import-time condition, like the OS) is to create a package named module1 and implement the two versions of your classes in entirely separate modules. The package structure would look like this:

module1/
|
+-- __init__.py
|
+-- _development.py
|
+-- _production.py

Both _development and _production define the same names, but different versions. The underscores in front of the module names imply that they should never be imported directly. You expose module1 as a module rather than as a package using its __init__ file, which would look something like this:

__all__ = ['MyClass']

if dev_mode():
    from ._development import MyClass
else:
    from ._production import MyClass

If you have a lot of names, you can automate the public import using __all__ in __init__:

import importlib
import sys

__all__ = ['MyClass']

self = sys.modules[__name__]
sub = importlib.import_module(
    '._development' if dev_mode() else '._production', __name__)
for name in __all__:
    setattr(self, name, getattr(sub, name))

This form of separation allows you to test both the production and the dev versions without having two separate test flows. Your tests can import the private modules directly.
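
For example, a sketch of such a test module (file and class names follow the layout above; DevMyClass and ProdMyClass are just local aliases):

# test_module1.py: target each implementation directly, bypassing the
# dev_mode() switch in module1/__init__.py.
from module1._development import MyClass as DevMyClass
from module1._production import MyClass as ProdMyClass

def test_both_versions_construct():
    assert isinstance(DevMyClass(), DevMyClass)
    assert isinstance(ProdMyClass(), ProdMyClass)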

Mad Physicist
  • As with the other answer, this requires two branches of testing (one for dev mode on and one for dev mode off) and a complicated setup of the test runner to avoid `dev_mode` being called at import time. Not ideal. – wim Jan 15 '18 at 21:45
  • @wim. The package version allows for a single test flow if you import the "private" submodules. I should explicitly state that in the answer. – Mad Physicist Jan 15 '18 at 22:12
  • Hmm, isn't this just moving code around? The same module level logical branch is still there in `__init__.py`. The long and short of it is the feature the OP needs simply doesn't exist in Python 2.7. And it's a useful feature, hence the PEP's acceptance - deprecation of module-level names was actually one of the main motivations mentioned. – wim Jan 15 '18 at 22:28
  • @wim, not quite. You now have two private modules that you can test individually. Not a huge improvement, but it should make at least some of the testing easier. – Mad Physicist Jan 15 '18 at 23:24