Let's say I have Python process 1 on machine 1 and Python process 2 on machine 2. Both processes are the same and process data sent by a load balancer.
Both processes need to interact with a database - Postgres in my case - so each process needs to know which database to talk to, needs to have the right models deployed on each machine, and so on. It's just too tightly coupled.
The ideal would be a separate process that deals with all the database concerns: connections, keeping up with DB model changes, executing requests against the database, etc. All my process 1 and process 2 should have to say is "I have some JSON data that needs to be saved or updated in this table" or "I need this data back in JSON format".
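To make the interface I'm after concrete, here is a minimal sketch (names are hypothetical, and an in-memory dict stands in for Postgres): the worker processes only ever pass table names and JSON-serializable dicts, and everything DB-specific lives behind one service object.

```python
import json
import uuid


class DataService:
    """Hypothetical facade: owns connections, schema/model knowledge, etc.
    Worker processes never see any of that - only table names and JSON."""

    def __init__(self):
        # Stand-in for Postgres: table name -> {row id: row dict}
        self._tables = {}

    def save(self, table, payload):
        """Insert or update a JSON-serializable dict; return the row id."""
        rows = self._tables.setdefault(table, {})
        # Round-trip through json to enforce JSON-serializability
        row = json.loads(json.dumps(payload))
        row_id = row.get("id") or str(uuid.uuid4())
        row["id"] = row_id
        rows[row_id] = row
        return row_id

    def fetch(self, table, row_id):
        """Return the row as a JSON string, or None if it doesn't exist."""
        row = self._tables.get(table, {}).get(row_id)
        return json.dumps(row) if row is not None else None
```

From process 1 or 2 the interaction would then be nothing more than `row_id = svc.save("users", {"name": "alice"})` and `svc.fetch("users", row_id)` - no connection strings or model classes on the worker side.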
Maybe I'm asking the impossible, but is there any Python solution that would at least make life a little easier when it comes to distributed processes interacting with relational databases in the most decoupled way possible?