
I'm writing a bunch of Python scripts that all work together; for instance, tool1.py loads some data into a database table, tool2.py reads from this table, does some calculations and writes the results into another table, and daemon1.py is a webserver serving the result. Each tool and daemon has a whole bunch of support files (needed only for that one tool) and needs its own directory. In addition, I have some code shared between all tools, such as config.py and database.py. Intuitively, I structured the project thus:

/README.md
/requirements.txt
/config.py # contains shared configuration
/database.py # shared code to connect to the database
/tool1/
    tool1.py # entrypoint
    [...] # bunch of other files only needed by tool1
/tool2/
    tool2.py # entrypoint
    [...] # bunch of other files only needed by tool2
/daemon1/
    daemon1.py # entrypoint
    [...] # bunch of other files only needed by daemon1

I then run my tools and daemons with the command python tool1/tool1.py. The problem here, however, is how tool1.py gets access to config.py/database.py. I considered the following options and would be interested in what is considered the "right" way in Python, or any alternatives I might have missed (perhaps a different way to lay out the project). Extra karma will be awarded for a link to an authoritative answer.
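
For concreteness, this is roughly what a naive import in tool1/tool1.py produces when the script is run from the project root (the error all the options below try to avoid):

# tool1/tool1.py
import database  # fails: database.py lives at the project root, not in tool1/

# $ python tool1/tool1.py
# ModuleNotFoundError: No module named 'database'
# (Python puts the script's own directory, tool1/, on sys.path, not the project root.)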

  1. Symlink the config.py/database.py files into the subdirectories. I don't like it much since it confuses my editor, and it seems to make things more complicated than necessary.
  2. Make config.py/database.py into a separate package, which I install in my virtualenv. I don't like it since I'm constantly changing these files as well, and I want to keep them in the same git repo.
  3. Change sys.path at the top of tool1.py. This results in 4 lines at the top of each file, plus importing the sys and os modules for no other reason than to set these items:
import os
import sys
sys.path.append(os.path.join(os.path.abspath(
    os.path.dirname(__file__)), ".."))
  4. Add the top-level path to $PYTHONPATH.
  5. Create top-level entrypoints for tool1.py/tool2.py/daemon1.py that read something like the following (after renaming the tool1 directory to tool1dir; a fuller sketch follows this list):
from tool1dir import tool1
tool1.run()
  6. Put config.py/database.py into a separate package and symlink that directory from each subdir.
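
For option 5, a slightly fuller sketch of what such a top-level wrapper might look like (assuming, as in the question, that tool1/ has been renamed tool1dir/ and that tool1.py exposes a run() function):

# run_tool1.py, placed at the project root
from tool1dir import tool1

if __name__ == "__main__":
    tool1.run()

Because the wrapper lives at the project root, config.py and database.py are importable from anywhere in the process without touching sys.path.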

As noted, I would like to hear the Pythonic way to do this, or any suggestions or preferences.

Claude

1 Answer


From Shinbero on another question:

import sys
sys.path.insert(0, '..')  # prepend the parent directory (relative to the cwd)
import database

The directory can be put back in its place on sys.path by adding:

sys.path.insert(0, 'tool1')  # replace 'tool1' with the directory you are calling from

A variation of your 3rd method was used to solve that question.
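
Spelled out at the top of tool1/tool1.py, the first snippet would look roughly like this (a sketch; note that a relative entry such as '..' is resolved against the current working directory, so it only finds the shared modules when the script is launched from inside tool1/):

import sys

sys.path.insert(0, '..')  # '..' is relative to the cwd, not to this file
import database           # shared modules at the project root
import config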

DucksEL
  • I don't see how this is any nicer than option 3 I mentioned (rather, I think it's less nice, but that might be a matter of taste). However, I was hoping for an answer explaining the best Pythonic way, as I already have 6 solutions. Also, to replace the directory, why not just remove the added `..` dir? – Claude May 24 '22 at 07:27
  • 1: If you already have solutions, then I don't have any idea why you're asking in the first place. 2: The `..` is automatically removed when added. – DucksEL May 24 '22 at 16:46