The Cython project I am currently working on includes some 20 C++ files that form the C++ implementation. On top of them I have built three Extension Modules written in Cython, each capturing different but interconnected functionality of the C++ implementation. To wrap the required functionality from C++ I use a single `.pxd` file that defines all the `cdef extern` blocks. The three `.pyx` files (one Extension Module per `.pyx` file) then `cimport` that `.pxd` wrapper and add extra functionality.
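For reference, the layout looks roughly like this (file names and the `Core` class are made up for illustration):

```cython
# --- cpp_wrapper.pxd (hypothetical name): all cdef extern blocks live here ---
cdef extern from "core.hpp":
    cdef cppclass Core:
        Core() except +
        void run()

# --- module_a.pyx (one of the three Extension Modules): cimports the wrapper ---
from cpp_wrapper cimport Core

def run_once():
    cdef Core c   # stack-allocated, default-constructed C++ object
    c.run()
```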
Now, my objective is to pack the whole library and distribute it using PyPi. My problem arises when using setup.py: I don't know how to make the C++ code available for all Extension Modules. For development I have installed the C++ as a dynamic library in Linux, which clearly has a lot of portability issues. I am more than willing to export the C++ source code to be compiled locally (thus creating a source distribution using python setup.py sdist
).
These are the approaches I've tried so far, all of which failed:
- Including the C++ source files in the `sources` of all extensions (see the setup.py sketch after this list). This compiles all the C++ files once per Extension, which creates multiple copies of global objects.
- Including the C++ source files only in the most important Extension and always importing it first. The other Extension Modules throw `undefined symbol` errors because they don't have access to the C++ implementation.
- Creating a 4th Extension Module with the name of the Cython `.pxd` wrapper and importing it first. Same result as above (#2).
Any tips will be highly appreciated!