
What are the best practices for importing applications that share the same code resources?


Imagine I have an automated stock trader that contains two services that run independently (on different machines). The two services are:

  • collection_service - collects stock prices every minute and stores them in a SQL database
  • decision_making - makes a decision every ten minutes (based on the collected data) whether or not to buy a stock.

To maintain a single source of truth (SSOT), they both use the same SQL table models (say, SQLAlchemy models); however, they each have different dependencies. In addition, they both use code that's written by my company in different projects.

My repository looks like this:

─my_companies_repo
    ├───auto_trader
    │   ├───collection
    │   │       main_collection.py
    │   │       requirements.txt
    │   │
    │   ├───db_manage
    │   │       sql_models.py
    │   │
    │   └───decision_making
    │           main_decision.py
    │           requirements.txt
    │
    └───common

How should the import statements look? Should I add several entries to PYTHONPATH when running the application, or have one root?

For example in:

main_decision.py

from auto_trader.db_manage import sql_models
# or add two PYTHONPATH entries (one for common and one for auto_trader) and do this:
from db_manage import sql_models
  • What do you mean by "pass two PYTHONPATHs"? PYTHONPATH is an environment variable. It is not "passed", and there is only one. – mkrieger1 Jan 16 '20 at 09:31
  • Does this answer your question? [What is the best project structure for a Python application?](https://stackoverflow.com/questions/193161/what-is-the-best-project-structure-for-a-python-application) – mkrieger1 Jan 16 '20 at 09:32
  • Or rather [What is the proper way to work with shared modules in Python development?](https://stackoverflow.com/questions/17174992/what-is-the-proper-way-to-work-with-shared-modules-in-python-development) – mkrieger1 Jan 16 '20 at 09:34
  • 1
    It tackles a similar problem however, In my use case this will result in four repo, one for company code, one for db manage one for collection and one for decision_making, seems wonky to me. – moshevi Jan 16 '20 at 09:45

2 Answers


Keep things simple: use absolute imports as much as possible, if not everywhere.

To do this correctly you need to figure out what your top-level packages (and modules) are. There are two cases:

  • either you package and install your project correctly, in which case you import from the site packages;
  • or you don't install your project and you import from the current working directory.

Looks like you are in the second case, and you want your top-level packages to be auto_trader and common. So write your imports like the following:

from auto_trader.db_manage import sql_models
from common import foo

Then make sure the current working directory is my_companies_repo and then call your main modules like this:

python3 -m auto_trader.collection.main_collection
python3 -m auto_trader.decision_making.main_decision
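For `python3 -m ...` to resolve those names, each directory has to be importable as a package. A sketch of the layout this assumes (regular packages with `__init__.py` files; on Python 3.3+ implicit namespace packages also work without them):

my_companies_repo
    ├───auto_trader
    │   │   __init__.py
    │   ├───collection
    │   │       __init__.py
    │   │       main_collection.py
    │   ├───db_manage
    │   │       __init__.py
    │   │       sql_models.py
    │   └───decision_making
    │           __init__.py
    │           main_decision.py
    └───common
            __init__.py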

And lastly, never modify the PYTHONPATH environment variable. If you feel like you need to, it most likely means you should spend some time correctly packaging your Python code into an installable project and installing it into the site packages with pip.
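If you do go down the packaging route, a minimal sketch of what that could look like (the project name and version here are hypothetical), with a setup.py placed at the root of my_companies_repo:

# setup.py at the root of my_companies_repo (hypothetical metadata)
from setuptools import setup, find_packages

setup(
    name='my-company-trader',  # hypothetical distribution name
    version='0.1.0',
    # picks up auto_trader and common (and their sub-packages), assuming each directory has an __init__.py
    packages=find_packages(),
)

An editable install (`pip install -e .` from the repository root) then makes both top-level packages importable from anywhere, without touching PYTHONPATH.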

sinoroc
  • This is a good reminder that when using `python -m` you need to use `.` to properly address your target app instead of `/` – tekneee Oct 22 '21 at 07:24

As your code layout is three packages (but possibly just one distributable package), you can consider doing something like this.

Create a setup.py

It will look something like this:

#!/usr/bin/env python

from setuptools import setup

setup(
    name='stocks',
    version='0.3',
    description='foo',
    author='bar',
    # list every package, including the parents, so the full import path gets installed;
    # each of these directories also needs an __init__.py file
    packages=['my_companies_repo',
              'my_companies_repo.auto_trader',
              'my_companies_repo.auto_trader.collection',
              'my_companies_repo.auto_trader.db_manage',
              'my_companies_repo.auto_trader.decision_making'])
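With this layout the shared models end up under the my_companies_repo namespace after installation, so the import in main_decision.py from the question would look something like this (assuming each directory contains an `__init__.py`):

from my_companies_repo.auto_trader.db_manage import sql_models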

The first step is to build a Python distribution package:

python setup.py sdist

That creates a file something like this:

dist/stocks-0.3.tar.gz

You now move that file to your servers where these parts are hosted:

  • Collection
  • db_manage
  • decision_making

So

scp dist/stocks-0.3.tar.gz server:~/

Now you log into those machines and install it:

pip3 install stocks-0.3.tar.gz
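A quick way to check the install worked (e.g. on the decision machine) is to import the shared models directly; this assumes the package layout and `__init__.py` files described above:

python3 -c "from my_companies_repo.auto_trader.db_manage import sql_models"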

You have to do this on each machine (this gets tedious - there is a way to solve that, but I am trying to keep things simple).

At this point the same software is on all the machines.

So how do we run it?

On the Collection machine:

python3 -m my_companies_repo.auto_trader.collection.main_collection

On the Decision machine:

python3 -m my_companies_repo.auto_trader.decision_making.main_decision
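If you would rather keep a shorter form such as `python3 -m my_companies_repo.auto_trader.collection`, the package needs a `__main__.py`; a minimal sketch for the collection side, assuming main_collection.py exposes a main() function (that name is hypothetical):

# my_companies_repo/auto_trader/collection/__main__.py (sketch)
from my_companies_repo.auto_trader.collection.main_collection import main  # assumes a main() entry point exists

if __name__ == '__main__':
    main()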
Tim Seed
  • when you say "there is a way to solve" do you mean some sort of CI/CD (like jenkins) ? – moshevi Jan 19 '20 at 12:58
  • CI/CD will handle the build/test/package build .... but great care needs to be taken when deploying. You need to consider what happens if something breaks - a web based company may choose to deploy - if it fails you then rollback. A company that uses IT to support its operations - is less likely to do this - and using a Beta/Alpha testing platform is a more likely deployment scenario. – Tim Seed Jan 21 '20 at 00:07