My company is in the process of updating our legacy Python 2.x scripts to Python 3, and we're running into some speed bumps while trying to be properly Pythonic in our update process.
When using `setuptools` to create a `console_scripts` entry point, we run into two issues:
- **Environment variables:** Some scripts rely on environment variables. For example, anything that uses `cx_Oracle` needs `LD_LIBRARY_PATH` set, because the module no longer finds the Instant Client libraries automatically (see the cx_Oracle docs). I want to set this variable per script, not OS-wide. I have tried various ways of setting it from within the Python itself, but none work; the script instead exits with a DPI-1047 error:

  ```
  libclntsh.so: cannot open shared object file: No such file or directory
  ```

- **Lack of `main()` functions:** Many of our older scripts have only an `if __name__ == "__main__":` guard rather than a `main()` function, and some have neither. To make a script work with an entry point, a `main()` has to be created, and often many variables need to be re-scoped or simply made global. This is turning out to be very time-consuming and error-prone.
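For context on the first point, the only approach I've seen that works from Python itself is re-exec'ing the interpreter after setting the variable, since the dynamic loader only reads `LD_LIBRARY_PATH` at process startup. A sketch of that workaround (the Instant Client path is a placeholder for our real one):

```python
import os
import sys

# Hypothetical Instant Client location -- adjust per machine.
INSTANT_CLIENT_DIR = "/opt/oracle/instantclient"

def needs_reexec(environ, libdir):
    """True when LD_LIBRARY_PATH does not yet include libdir.

    The dynamic loader reads LD_LIBRARY_PATH once, at process startup,
    which is why assigning to os.environ from running Python never cures
    DPI-1047: the variable has to be in place before the interpreter starts.
    """
    return libdir not in environ.get("LD_LIBRARY_PATH", "").split(os.pathsep)

def main():
    if needs_reexec(os.environ, INSTANT_CLIENT_DIR):
        env = dict(os.environ, LD_LIBRARY_PATH=INSTANT_CLIENT_DIR)
        # Replace this process with a fresh interpreter that sees the
        # variable from startup; cx_Oracle can then locate libclntsh.so.
        os.execve(sys.executable, [sys.executable] + sys.argv, env)
    import cx_Oracle  # noqa: F401 -- resolvable after the re-exec
```

This does work, but pasting that boilerplate into every entry point feels wrong.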
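To illustrate the second point, the refactor we keep doing looks roughly like this (all names hypothetical): the body of the guard moves into `main()`, which the entry point can then target as `myscript = mypkg.myscript:main`.

```python
import sys

def process(items):
    # Hypothetical stand-in for the script's real work.
    return [item.upper() for item in items]

def main(argv=None):
    # Taking argv as a parameter keeps main() callable from tests
    # as well as from the console_scripts entry point.
    argv = sys.argv[1:] if argv is None else argv
    for line in process(argv):
        print(line)
    return 0

# The legacy module ended with:
#     if __name__ == "__main__":
#         <this body, inline, using module-level variables>
# and hoisting those module-level variables into main() is exactly
# the re-scoping work described above.
```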
Because of these hurdles, I'm beginning to wonder whether `console_scripts` entry points are the best solution for our updates. The Python Packaging guide (Packaging docs) indicates that the `scripts` keyword argument can be used for non-Python scripts. I could use a bash wrapper to set the environment variable, but then I'm unsure of the correct way to invoke the Python code, since it will be installed as a module. Calling it by the full path into the site-packages directory doesn't seem right...
Is there a better way to approach this that I am missing?
We are using conda (tar.bz2 files) as our package format, but I don't believe that's relevant, since we're also using `setuptools`.
Any input is greatly appreciated!