My use case is the following:
- several scripts that can be run individually
- one 'mother' script that can run all the scripts
- each script has global variables it uses (paths to specific locations, constants used by all scripts)
- some of these global variables are read from a settings.yaml file which gets dynamically loaded with yaml.load()
- several people are running the system, we want to avoid environment variables or any kind of low-level path settings
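For concreteness, a minimal settings.yml might look like this (the two key names come from the settings.py snippet below; the path values here are made-up examples):

```yaml
# Hypothetical example values; only the key names are real.
TOHELLO: /data/exports/hello
TOBONJOUR: /data/exports/bonjour
```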
The use cases were satisfied with an
import settings
at the beginning of each export module.
The settings.py script is our 'initialize' script that loads and creates the variables used by all the modules, e.g.:
import os
import yaml
from utils import create_dir
with open("settings.yml") as input_f:
    settings = yaml.safe_load(input_f)
TOHELLO = settings['TOHELLO']
TOBONJOUR = settings['TOBONJOUR']
# check for needed directories and create them if necessary
create_dir(TOHELLO)
create_dir(TOBONJOUR)
The 'mother module' export_all.py uses subprocess.Popen() to run all the scripts sequentially.
Originally all files were in the root:
+--settings.py
+--settings.yml
+--export_foo1.py
+--export_foo2.py
+--export_foo3.py
+--export_foo4.py
+--export_bar1.py
+--export_bar2.py
+--export_bar3.py
+--export_bar4.py
+--export_all.py
With time, though, this became messy and we decided to restructure it:
+--settings.py
+--settings.yml
+--Foo
+ +--export_foo1.py
+ +--export_foo2.py
+ +--export_foo3.py
+ +--export_foo4.py
+--Bar
+ +--export_bar1.py
+ +--export_bar2.py
+ +--export_bar3.py
+ +--export_bar4.py
+--export_all.py
And then all hell broke loose.
We discovered that one can only import modules that are either in the same directory as the running script or somewhere on sys.path.
The solution we've come up with involves inserting this beauty at the top of all of our scripts:
import os
import sys

PACKAGE_PARENT = os.path.join('..', '..')
SCRIPT_DIR = os.path.dirname(os.path.realpath(os.path.join(os.getcwd(), os.path.expanduser(__file__))))
sys.path.append(os.path.normpath(os.path.join(SCRIPT_DIR, PACKAGE_PARENT)))
but it just seems like overkill to us.
We tried
from settings import functionname
but that doesn't run the code that generates the constants from the YAML file.
Also, we tried squeezing the imports into __init__.py in each directory, but then the subprocess.Popen() calls didn't work.
We're kind of missing a pythonesque way to have some global constants and an initializing script that runs only once (even when the 'mother' script is running).
Is it us, or is python just weird and ugly here?