We have a Python application with over twenty modules, most of which are shared by several web and console applications.
I've never had a clear understanding of the best practice for establishing and managing database connections in multi-module Python apps. Consider this example:
I have a module defining an object class for Users. It has many methods for creating, deleting, and updating users in the database. The users.py module is imported into 1) a console-based utility, 2) a web.py-based web application, and 3) a constantly running daemon process.
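To make this concrete, here is a stripped-down sketch of the pattern in users.py; sqlite3 stands in for the real driver, and the table and column names are placeholders:

```python
# users.py -- simplified sketch; sqlite3 stands in for the real driver,
# and the table/column names are placeholders. A `users` table is assumed to exist.
import sqlite3

DB_PATH = "app.db"  # placeholder for the real connection settings

class Users:
    def create_user(self, username, email):
        conn = sqlite3.connect(DB_PATH)            # open...
        try:
            conn.execute(
                "INSERT INTO users (username, email) VALUES (?, ?)",
                (username, email),
            )
            conn.commit()                          # ...use...
        finally:
            conn.close()                           # ...close, on every call

    def delete_user(self, user_id):
        conn = sqlite3.connect(DB_PATH)
        try:
            conn.execute("DELETE FROM users WHERE id = ?", (user_id,))
            conn.commit()
        finally:
            conn.close()
```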
Each of these three applications has a different life cycle. The daemon can open a connection and keep it open. The console utility connects, does its work, then exits. The HTTP requests are of course atomic, but the web server itself is a daemon.
I am currently opening, using, and then closing a connection inside each function of the Users class, as in the sketch above. This seems the least efficient approach, but it works in all three cases. An alternative I tried as a test is to declare and open a single global connection for the entire module. A third option would be to create the connection at the top application layer and pass a reference in when instantiating the class, but that seems like the worst idea to me.
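For comparison, the pass-it-in variant would look roughly like this (again a sketch with placeholder names), where the application owns the connection and the class only uses it:

```python
# users.py -- variant where the caller owns the connection
class Users:
    def __init__(self, conn):
        self.conn = conn  # connection is created and closed by the application

    def create_user(self, username, email):
        self.conn.execute(
            "INSERT INTO users (username, email) VALUES (?, ?)",
            (username, email),
        )
        self.conn.commit()

# console utility: connect, do its work, then exit
import sqlite3

conn = sqlite3.connect("app.db")
conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, username TEXT, email TEXT)")
Users(conn).create_user("alice", "alice@example.com")
conn.close()

# the daemon would instead open the connection once at startup and reuse it for its lifetime
```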
I know every application architecture is different. I'm just wondering: is there a best practice here, and if so, what is it?