I want to use some C and CUDA code in my Python package (which I then call using ctypes). Because of the CUDA code, the traditional approach of a setuptools Extension doesn't seem feasible, so I instead pre-compile the code into shared libraries and include them in a Wheel. When I build the Wheel it does include the shared libraries, but the output Wheel still gets a "none-any" platform tag in its filename, indicating that it is pure Python and platform independent, which is not correct. cibuildwheel then refuses to run auditwheel on it because of this. I have read posts about forcing Wheels to be labelled as platform dependent, but I imagine that I shouldn't have to force it and that I am instead doing something wrong.
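For context, the ctypes side looks roughly like this (a hypothetical sketch; `load_native` and the function names are illustrative, not my actual code):

```python
# Sketch of loading a pre-built shared library that ships inside the package.
import ctypes
import pathlib

def load_native(libdir: pathlib.Path, name: str) -> ctypes.CDLL:
    """Load a shared library that was shipped alongside the Python sources."""
    return ctypes.CDLL(str(libdir / name))

# Inside mypackage/__init__.py one might then write:
#     _cpu = load_native(pathlib.Path(__file__).parent, "libmypackage_cpu.so")
#     _cpu.some_function.restype = ctypes.c_double
```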
Directory structure:
```
mypackage/
├── pyproject.toml
├── setup.cfg
└── src/
    └── mypackage/
        ├── __init__.py
        ├── aaa.py
        ├── bbb.c
        ├── ccc.cu
        ├── libmypackage_cpu.so
        ├── libmypackage_cuda.so
        └── before_all.sh    (script to build the .so files)
```
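For reference, before_all.sh does something roughly like the following (a sketch only; the actual compiler flags and CUDA setup are more involved):

```shell
#!/usr/bin/env bash
# Build the two shared libraries next to the Python sources.
# Flags here are illustrative, not the real build configuration.
set -euo pipefail
cd "$(dirname "$0")"

gcc  -O2 -fPIC -shared bbb.c  -o libmypackage_cpu.so
nvcc -O2 -Xcompiler -fPIC -shared ccc.cu -o libmypackage_cuda.so
```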
pyproject.toml:
```toml
[build-system]
requires = ["setuptools>=42"]
build-backend = "setuptools.build_meta"

[tool.cibuildwheel]
build = "cp36-*"
skip = ["pp*", "*686"]
manylinux-x86_64-image = "manylinux2014"

[tool.cibuildwheel.linux]
before-all = "bash {project}/src/mypackage/before_all.sh"
```
setup.cfg:
```ini
[metadata]
name = mypackage

[options]
package_dir =
    = src
include_package_data = True
zip_safe = False
packages = find:
python_requires = >=3.6

[options.packages.find]
where = src

[options.package_data]
mypackage =
    *.so
```
Am I missing something that prevents the packager from detecting that the Wheel is not pure Python and platform independent, or is forcing the packager to label it otherwise (such as by declaring an empty Extension) the normal approach?