
Say I have a master script that runs weekly via cronjob. This script imports functions from a bunch of other Python files and runs them in sequence. I'd also like to be able to run a couple of those functions ad hoc from the Terminal, without running the whole sequence. What is the best way to structure both the master script and the individual files containing the functions? Example of the current situation:

master_script.py

import do_bulk_things as b
import do_another_thing as a

b.do_first_bulk_thing()
b.do_second_bulk_thing()
if b.do_third_bulk_thing():
    a.my_other_thing()

do_bulk_things.py

def do_first_bulk_thing():
    # Code

def do_second_bulk_thing():
    # Code

def do_third_bulk_thing():
    # Code
    if successful:
        return True

do_another_thing.py

def my_other_thing():
    # Code

If I want to run my_other_thing() without running the entire master_script.py, how and where should I be defining and calling everything? The imported files just have function definitions so I can't actually execute any function by running python do_another_thing.py; and I also shouldn't execute the function my_other_thing() within do_another_thing.py because then it will run on import. It seems to me that I need to restructure things, but I need some best practices.

  • Do you know how to use `sys.argv` and `argparse`? You could have `master_script.py` run your standard set of functions as a function, e.g. `standard()` by default (when no command line arguments are received) and then if you provide some command line arguments, have it run a different function `other_things()`. If that sounds useful and you don't know how to do it I will write up an answer. I'm assuming you do not want to do these ad-hoc activities from a python session. That would be easier and a different solution. – KobeJohn Apr 18 '14 at 00:57
  • @kobejohn I know what `argv` does, but haven't used it or `argparse` before. [Documentation](https://docs.python.org/3/library/argparse.html) on `argparse` is more than enough to show me the way. If using command line arguments is a better practice than `__main__` for a particular reason, I'd like to hear why. It seems like using args would require more modification of the existing code but would in the end provide more flexibility down the road. – nshaas Apr 18 '14 at 14:41
  • If you can use a python session to do your ad-hoc work, then yes, `__main__` detection was the other solution I was talking about and it's easier than worrying about parsing command line arguments. Because you are working with cron-jobs I thought maybe constraints prevent you from working in a python session. I'll comment on your answer if I have any improvements. – KobeJohn Apr 18 '14 at 15:21
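
For reference, here is a minimal sketch of the command-line-argument approach described in the comments above: master_script.py keeps its imports, wraps the weekly sequence in a `standard()` function, and uses `argparse` to decide what to run. The task name other_thing and the exact argument wiring are only illustrative, not part of the original code.

import argparse

import do_bulk_things as b
import do_another_thing as a

def standard():
    # The full weekly sequence that cron runs
    b.do_first_bulk_thing()
    b.do_second_bulk_thing()
    if b.do_third_bulk_thing():
        a.my_other_thing()

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Run the weekly tasks or a single ad-hoc task.')
    parser.add_argument('task', nargs='?', default='standard',
                        choices=['standard', 'other_thing'],
                        help='task to run (default: standard)')
    args = parser.parse_args()
    if args.task == 'other_thing':
        a.my_other_thing()
    else:
        standard()

With this layout, cron keeps invoking python master_script.py for the default weekly run, while python master_script.py other_thing runs just the one function ad hoc from the Terminal.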

1 Answer


Going to attempt to answer my own question after some more research, which then led me here. Keep the function definitions in do_bulk_things.py and do_another_thing.py, but add a __main__ check so each file can be executed directly without the functions running on import. master_script.py remains unchanged, while the other files become:

do_bulk_things.py

def do_first_bulk_thing():
    # Code

def do_second_bulk_thing():
    # Code

def do_third_bulk_thing():
    # Code
    if successful:
        return True

if __name__ == '__main__':
    do_first_bulk_thing()
    do_second_bulk_thing()
    do_third_bulk_thing()

And do_another_thing.py

def my_other_thing():
    # Code

if __name__ == '__main__':
    my_other_thing()
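
With this in place, master_script.py and the weekly cronjob keep working exactly as before (importing the files only defines the functions), while running python do_another_thing.py from the Terminal executes just my_other_thing().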
  • You've got the main idea. Actually in this particular example, the `__main__` detection is not necessary because you are only defining functions. Nothing will change for you. However if you have some other code inside of `bulk` or `another`, then that could be wrapped up in a `main()` to avoid being run when you just want to import the functions. – KobeJohn Apr 18 '14 at 15:26
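
A minimal sketch of that pattern, assuming do_another_thing.py later picks up some module-level work of its own (the print call stands in for whatever hypothetical setup that might be):

def my_other_thing():
    # Code
    pass

def main():
    # Anything that should only happen when this file is executed directly,
    # and never when master_script.py merely imports it, goes here.
    print('Running do_another_thing ad hoc')  # hypothetical setup/logging step
    my_other_thing()

if __name__ == '__main__':
    main()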