I maintain a Cython binding to some OCaml code (through their respective C interfaces). For past versions, I managed to cheat and distribute a wheel file for Windows through cross-compilation. Now I have finally found a clean, native way to produce the library for 64-bit Windows.
For the 32-bit cross-compiled version, I had a specific target in my setup.py with the proper commands to execute. Back on native Windows, I would like to stick to a setuptools-style way of doing things, but the catch is that I need to replace the regular linking command, link.exe, with a different tool (namely flexlink.exe, which ships with OCaml on Windows).
Don't panic: flexlink.exe just generates some assembler glue before compiling and linking with the regular link.exe. It is the proper way to link OCaml executables and shared libraries under Windows.
For macOS and Linux, the traditional Extension pattern works like a charm, as follows (mlobject is produced by OCaml a bit earlier in the file after some timestamp checks, roughly as sketched after the snippet; asmrunlib is the full path to the OCaml equivalent of python36.dll):
extensions = [
    Extension("foo",
              ["foo.pyx", "interface_c.c"],
              language="c",
              include_dirs=INCLUDE,
              extra_compile_args=compileargs,
              extra_link_args=[mlobject, asmrunlib, ]
              )
]
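For reference, the mlobject step looks roughly like this (a simplified sketch only: the file names and the exact ocamlopt invocation are placeholders, not what my setup.py actually runs):

import os
import subprocess

ML_SOURCE = "foo_ml.ml"   # hypothetical name for the OCaml side
mlobject = "foo_ml.o"     # single object file produced by ocamlopt

# Rebuild the OCaml object only when the source is newer (the timestamp check).
if (not os.path.exists(mlobject)
        or os.path.getmtime(ML_SOURCE) > os.path.getmtime(mlobject)):
    # -output-obj makes ocamlopt emit one linkable object file.
    subprocess.check_call(
        ["ocamlopt", "-output-obj", "-o", mlobject, ML_SOURCE])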
Let's say I limit myself to Python >= 3.5. I guess (by looking at much bigger projects like NumPy) that I would need to start by extending distutils._msvccompiler.MSVCCompiler and replacing the self.linker = _find_exe("link.exe", paths) line with something based on flexlink.exe.
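Very roughly, I picture something like this (completely untested: assuming flexlink.exe is on PATH and that overriding initialize() is the right hook):

from distutils import _msvccompiler

class FlexlinkCompiler(_msvccompiler.MSVCCompiler):
    def initialize(self, plat_name=None):
        # Let the stock implementation locate cl.exe/link.exe and set up
        # the INCLUDE/LIB environment as usual.
        super().initialize(plat_name)
        # Then point the linker at flexlink.exe instead of link.exe.
        self.linker = "flexlink.exe"
        # flexlink does not take the same flags as link.exe, so link()
        # and/or the linker flags would presumably need adjusting too.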
The problem is that I have no idea how to manage the plumbing work that comes next: connecting this extended compiler to the build and making it look like the regular MSVC compiler to the setup process. I suppose it is not thoroughly documented anywhere, but if they were able to do far more than that in NumPy, I should be able to reach my goal somehow.
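Is something along these lines the intended approach? (Again just a sketch, hooking build_ext so the linker is swapped only when the MSVC compiler is in use; the bare flexlink.exe name is an assumption, and the extra_link_args would surely need adapting to flexlink's flags.)

from setuptools import setup
from setuptools.command.build_ext import build_ext as _build_ext
from Cython.Build import cythonize

class build_ext(_build_ext):
    def build_extensions(self):
        if self.compiler.compiler_type == "msvc":
            # Force the normally lazy lookup of cl.exe/link.exe now...
            self.compiler.initialize()
            # ...then swap the linker for flexlink.exe (assumed on PATH).
            self.compiler.linker = "flexlink.exe"
        super().build_extensions()

setup(
    name="foo",
    ext_modules=cythonize(extensions),
    cmdclass={"build_ext": build_ext},
)

That would sidestep subclassing MSVCCompiler altogether, but I have no idea whether mutating the compiler instance like this is supported.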
My setup.py is still reasonably basic, and a solution that keeps the whole building/packaging process in one single file would be great!