I have a python module which contains a C++ extension as well as a shared library on which the C++ extension depends. The shared library is picked up by setuptools as an `extra_objects` entry on the extension. After running `python setup.py bdist_wheel`, the module wheel is generated properly and has the following directory structure:
```
+-pymodule.whl
| +-pymodule
| +-pymodule-version.dist-info
| +-extension.so
| +-shared_lib.so
```
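For context, the relevant part of the `setup.py` looks roughly like the sketch below; the module name, source path, and library path are placeholders for the real CMake-built targets:

```python
from setuptools import setup, Extension

# Placeholder names and paths; in reality both extension.so and
# shared_lib.so are produced by the same CMake build.
extension = Extension(
    "extension",
    sources=["src/extension.cpp"],
    # The prebuilt shared library is handed to the linker as an extra object.
    extra_objects=["build/shared_lib.so"],
)

setup(
    name="pymodule",
    version="0.1.0",
    packages=["pymodule"],
    ext_modules=[extension],
)
```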
To install this wheel, I call `pip install pymodule.whl` in my python environment, which copies the python sources as well as the `.so` files into the environment's `site-packages` directory.
After installing the module, one can attempt to import it by calling `import pymodule` in a terminal for the environment. This raises an exception:

```
ImportError: shared_lib.so: cannot open shared object file: No such file or directory
```
This exception can be resolved by appending the appropriate `site-packages` directory to the `LD_LIBRARY_PATH` variable; however, it seems that this should work out of the box, especially considering that python is clearly able to locate `extension.so`.
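A related workaround that avoids the environment variable is to preload the library from the package's `__init__.py` before the extension is imported; a minimal sketch, assuming `shared_lib.so` lands directly in `site-packages` next to the package, as in the layout above:

```python
# pymodule/__init__.py -- sketch of a ctypes preload workaround.
import ctypes
import os

# Assumption: shared_lib.so sits one directory above this package,
# i.e. directly in site-packages, matching the wheel layout.
_sitedir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Loading the library by absolute path brings it into the process, so the
# dynamic linker can then resolve extension.so's dependency on it by name.
ctypes.CDLL(os.path.join(_sitedir, "shared_lib.so"), mode=ctypes.RTLD_GLOBAL)

import extension  # noqa: E402  (imported only after the preload)
```

This still feels like a patch over the real problem, though.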
Is there a way to force python to locate this shared library without having to explicitly point `LD_LIBRARY_PATH` at the installation location (i.e. `site-packages`)?
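I suspect what I want amounts to baking an rpath into the extension at link time; a hypothetical sketch of what I mean, using setuptools' `extra_link_args` (the flag and paths are assumptions, not something I have verified):

```python
from setuptools import Extension

extension = Extension(
    "extension",
    sources=["src/extension.cpp"],
    extra_objects=["build/shared_lib.so"],
    # Hypothetical: embed $ORIGIN as an rpath so the dynamic linker searches
    # the extension's own directory for shared_lib.so at import time.
    extra_link_args=["-Wl,-rpath,$ORIGIN"],
)
```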
This question works around a similar problem by using package data and explicitly specifying an install location for the shared library (roughly the sketch below). The issue I have with that approach is that the shared object is decoupled from the extension; in my case, the shared library and extension are both targets built by the same CMake build. I had previously attempted to use `skbuild` to build CMake-based extensions; however, as per this issue, `skbuild` has a similar problem with including other libraries generated as part of the extension build.
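For reference, my understanding of the package-data approach from that question, as a sketch (names are placeholders): the shared library is shipped as plain data inside the package, with nothing tying it to the extension build.

```python
from setuptools import setup

# Sketch of the package-data workaround: shared_lib.so is copied into the
# pymodule package by hand and shipped as data. Nothing ties it to the
# CMake build that actually produces it, which is exactly the decoupling
# I'd like to avoid.
setup(
    name="pymodule",
    version="0.1.0",
    packages=["pymodule"],
    package_data={"pymodule": ["shared_lib.so"]},
)
```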