
I have a project that depends on a shared library. To make it clear from the beginning: the shared library is a pure C library, not a Python library. For simplicity I created a small demo project called pkgtest, which I will refer to throughout.

So what needs to be done is: run a Makefile to compile the library and place the compiled shared library file (called libhello.so here) somewhere it can be accessed from within the depending Python package.

My best guess so far was to run the Makefile as a preinstallation routine, copy the libhello.so file into the package directory, and add it to the package_data parameter of the setup script. On installation the shared library then gets placed in the site-packages/pkgtest/ directory and can be accessed from the module.

The package directory structure is as simple as this:

pkgtest/
  src/
     libhello.c
     libhello.h
     Makefile
  pkgtest/
    __init__.py
    hello.py
  setup.py

My setup.py looks like this:

setup.py

import subprocess
from setuptools import setup
from distutils.command.install import install as _install


class install(_install):
    def run(self):
        subprocess.call(['make', 'clean', '-C', 'src'])
        subprocess.call(['make', '-C', 'src'])
        _install.run(self)


setup(
    name='pkgtest',
    version='0.0.1',
    author='stefan',
    packages=['pkgtest'],
    package_data={'pkgtest': ['libhello.so']},
    cmdclass={'install': install},
)

The Makefile builds the library and copies it into the directory of my Python package.

src/Makefile

all: libhello.so

libhello.o: libhello.c
        gcc  -fPIC -Wall -g -c libhello.c

libhello.so: libhello.o
        gcc -shared -fPIC -o libhello.so libhello.o
        cp libhello.so ../pkgtest/libhello.so

clean:
        rm -f *.o *.so

All hello.py actually does is load the library and call the hello function, which prints some text. But for completeness I will show the code here:

pkgtest/hello.py

import os
import ctypes

basedir = os.path.abspath(os.path.dirname(__file__))
libpath = os.path.join(basedir, 'libhello.so')

dll = ctypes.CDLL(libpath)

def say_hello():
    dll.hello()

So this actually works, but what I don't like about this approach is that the shared library lives in the directory of the Python package. I figure it would be better to put it in some sort of central library directory such as /usr/lib/, but that would require root privileges on installation. Does anybody have experience with this kind of problem and would like to share a solution or a helpful idea? That would be great.
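One middle ground might be to keep the package-local copy as the default and fall back to the system-wide search path via `ctypes.util.find_library`, which covers locations like /usr/lib. A sketch (`find_libhello` and `load_libhello` are made-up helper names, not part of the demo project):

```python
import ctypes
import ctypes.util
import os


def find_libhello(basedir):
    """Prefer a libhello.so sitting next to the package; otherwise
    fall back to the standard system search path (e.g. /usr/lib)."""
    local = os.path.join(basedir, 'libhello.so')
    if os.path.exists(local):
        return local
    # find_library searches the usual linker locations and
    # returns None if nothing matches.
    return ctypes.util.find_library('hello')


def load_libhello(basedir):
    path = find_libhello(basedir)
    if path is None:
        raise OSError('libhello.so not found')
    return ctypes.CDLL(path)
```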

MrLeeh
  • Have you considered using a package manager like [conda](https://conda.io/miniconda.html)? You could create a separate package for your library and for the Python code, and specify the library as a dependency of the Python code in the `meta.yaml` file. – ostrokach Jul 15 '17 at 18:36
  • I didn't think about conda, yet. You are right, this would definitely be an option, however not my first choice as I want the package to be available via `pip install ...`. – MrLeeh Jul 15 '17 at 18:47
  • In that case, I don't think having a compiled library file inside your python package folder is a problem. That's where [Python extension modules](https://docs.python.org/3/extending/building.html) and [Cython](http://docs.cython.org/en/latest/index.html) modules are placed. But maybe there's a better solution... – ostrokach Jul 15 '17 at 19:08
  • I see no reason to have a Makefile in the first place. Simple C object files can be compiled and installed by setuptools and the like automatically. See [here](https://docs.python.org/3/extending/building.html). – user2722968 Jul 15 '17 at 19:15
  • I use the library above only as an example. The actual library is more complex, with multiple source files and linking. I didn't see how to do this with setuptools. – MrLeeh Jul 15 '17 at 19:41
  • Similar to: https://stackoverflow.com/q/31380578/1959808 – 0 _ Sep 11 '17 at 13:09
  • Consider doing this entirely with [CFFI](https://cffi.readthedocs.io/en/latest/overview.html) – o11c Apr 26 '20 at 19:12

2 Answers


You can create a Python package which includes shared libraries and works on (almost) any linux distro using manylinux.

The goal of the manylinux project is to provide a convenient way to distribute binary Python extensions as wheels on Linux. This effort has produced PEP 513 which defines the manylinux1_x86_64 and manylinux1_i686 platform tags.

The general procedure is:

  1. Build the external library and the Python package inside one of the docker containers provided by the manylinux team (see python-manylinux-demo)
  2. Run `auditwheel repair` to copy the external shared libraries that your package depends on into the Python wheel, setting the `RPATH` accordingly.

See .travis.yml and build-wheels.sh in the python-manylinux-demo repo for an example.

ostrokach
  • This looks great. If I get it right, the example shows how to build a Python C extension and include that in the wheel. As stated in the question, the aim is not to build a Python C extension but to build and include a pure C shared library. I still can't figure out how to manage that. – MrLeeh Jul 26 '17 at 07:58
  • @MrLeeh: I am also searching for a solution to an almost identical problem (Fortran instead of C). Have you found a workaround? – Peaceful Nov 21 '20 at 14:18
  • What is the process to include internal [instead of external] shared libraries in the `bdist_wheel`? – Chaitanya Bapat Jan 11 '21 at 04:56

`package_data` is for data. `setup` can do the compilation into an `*.so`.

Following my solution in *python setup.py build ctypes.CDLL: cannot open shared object file: No such file or directory*, I think your setup could employ `ext_modules` and `py_modules`, something like:

from setuptools import setup, Extension

setup(
    name='pkgtest',
    py_modules=['pkgtest'],
    ext_modules=[Extension('src.libhello', ['src/libhello.c'])],
)
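Regarding the comment that the real library has multiple source files and needs linking: `Extension` accepts those directly. A sketch in which `src/util.c` and the `-lm` link flag are made-up examples:

```python
from setuptools import Extension

libhello = Extension(
    'pkgtest.libhello',                        # built into the pkgtest package
    sources=['src/libhello.c', 'src/util.c'],  # list every .c file here
    include_dirs=['src'],                      # where libhello.h lives
    libraries=['m'],                           # extra link flags, here -lm
    extra_compile_args=['-fPIC', '-Wall'],
)
# then pass it to setup(..., ext_modules=[libhello])
```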