
A package that I am writing keeps growing, and a lot of its functions take the same arguments (e.g. something like overwrite=True). I was thinking about ways to reduce some of the redundancy by setting default arguments globally. I have tried a decorator (that's what they are for, right? [seen here])

so, instead of

def function1(unique_arg1=1, unique_arg2=2, overwrite=True, feedback=False):
    print(locals())

function1()
        

I have tried something like this

## define in settings.py    
default_args_dict = {
    "overwrite": True,
    "feedback": False}

def add_defaults(func):
    def inner(default_args=default_args_dict):
        return func(**default_args)
    return inner

## script.py
from settings import add_defaults

@add_defaults
def function1(unique_arg1=1, unique_arg2=1, **kwargs):
    print(locals())
    
function1()

Unfortunately, passing an optional argument doesn't work anymore:

## doesn't work 
function1(overwrite=False)

TypeError: inner() got an unexpected keyword argument 'overwrite'

Can somebody help me here? And, more importantly, is this a terrible idea?

mluerig
  • Maybe put the args into an object, and have a constructor for that object that provides the defaults? – Samwise Mar 20 '21 at 20:36
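The comment above can be sketched as follows: collect the shared settings in a small config object whose constructor supplies the defaults. (A minimal illustration, not from the question; the `Options` class and argument names are hypothetical.)

```python
from dataclasses import dataclass

# Hypothetical container for the shared defaults, per the comment above.
@dataclass
class Options:
    overwrite: bool = True
    feedback: bool = False

def function1(unique_arg1=1, unique_arg2=2, opts=None):
    # Fall back to the default Options if the caller passes nothing.
    opts = opts if opts is not None else Options()
    return unique_arg1, opts.overwrite, opts.feedback

print(function1())                               # defaults apply
print(function1(opts=Options(overwrite=False)))  # one setting overridden
```

Each function then takes a single `opts` parameter instead of repeating `overwrite`/`feedback` everywhere.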

1 Answer


The problem is that default_args=default_args_dict defines a single keyword parameter named default_args whose default value is the dictionary, so inner accepts no other keyword arguments. Instead, accept arbitrary keyword arguments and merge your defaults with what's passed.

def inner(**default_args):
    default_args = default_args_dict | default_args
    return func(**default_args)

If you're on an older version of Python (the | dict-merge operator requires 3.9+), instead of the | line, you can use

default_args = {**default_args_dict, **default_args}
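Putting the fix together, a complete sketch of the decorator might look like this (the *args forwarding and functools.wraps are additions beyond the answer, so positional arguments still work and the wrapped function keeps its name):

```python
import functools

# Shared defaults, as in settings.py from the question.
default_args_dict = {"overwrite": True, "feedback": False}

def add_defaults(func):
    @functools.wraps(func)
    def inner(*args, **kwargs):
        # Keyword arguments passed by the caller win over the global defaults.
        merged = {**default_args_dict, **kwargs}
        return func(*args, **merged)
    return inner

@add_defaults
def function1(unique_arg1=1, unique_arg2=2, **kwargs):
    return unique_arg1, kwargs

print(function1())                 # (1, {'overwrite': True, 'feedback': False})
print(function1(overwrite=False))  # (1, {'overwrite': False, 'feedback': False})
```

Note the merge order: because `kwargs` comes last, an explicit `function1(overwrite=False)` overrides the global default rather than raising the TypeError from the question.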
miquelvir
Carcigenicate