I am attempting to change how the Python bindings for a "large" C++ software project (i.e. the bindings are a minor part) are shipped so that the bindings are no longer installed as bare, metadata-free .so files in site-packages. The bindings' .so files are already built as part of the rest of the software's mess of cmake instructions, and I'm adding instructions and the proper module structure (i.e. <module>/__init__.py) so that setuptools can be in charge of the install step. In setup.py, I have the prebuilt .so files included as package_data, as was done in the related question Distributing pre-built libraries with python modules. And when cmake && make && make install gets around to invoking python setup.py build and python setup.py install --root=${CMAKE_INSTALL_PREFIX}, everything technically works. Hooray.

The problem is that python setup.py install ... drops everything under {PREFIX}/usr/lib/pythonN.M/site-packages, even though the package data includes binaries built for a 64-bit architecture. I'm struggling to figure out how to get setuptools to install to the right place, e.g. /usr/lib64 on Linux. I could add some cmake logic to figure out what the libdir should be and pass it to setup.py install as --install-lib, but it seems like the right thing to do is to somehow make setuptools aware, during the install step, that the contents of the package are platform-specific, so that it sets the install location accordingly.
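If I did go the cmake route, I imagine the libdir could be computed from Python itself rather than hard-coded in cmake — something like this sketch (the function name platform_install_lib is just illustrative; the idea is that sysconfig's "platlib" path is the platform-specific site-packages directory):

```python
# Sketch: compute the platform-specific site-packages directory that
# cmake could pass along as "setup.py install --install-lib".
import sysconfig


def platform_install_lib():
    # "platlib" is where packages containing compiled code get
    # installed (e.g. /usr/lib64/pythonN.M/site-packages on some
    # Linux distributions), as opposed to "purelib" under /usr/lib.
    return sysconfig.get_paths()["platlib"]
```

cmake could invoke this with something like execute_process(COMMAND ${PYTHON_EXECUTABLE} -c "...") and feed the result to --install-lib, but that's exactly the workaround I'd like to avoid.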

I assume this is largely because I'm including the bindings as package data rather than building them inside the setup script as extension modules. Is there some way to tell setuptools that a package without explicit ext_modules is platform-specific so that python setup.py install places files under /usr/lib64/... instead of /usr/lib/...?

