I have some `.proto` gRPC files I want to compile as part of the `setup.py` script. This requires running `from grpc_tools import protoc` and calling `protoc` before `setup(args)`. The goal is to compile and install the pb files from `pip install pkgname`.
E.g.
```python
# setup.py
from setuptools import setup

pkgname = "pkgname"  # placeholder for the real distribution name

# generate our pb2 files in the temp directory structure
compile_protobufs(pkgname)

# this will package the generated files and put them in site-packages or .whl
setup(
    name=pkgname,
    install_requires=['grpcio-tools', ...],
    ...
)
```
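Here `compile_protobufs` is roughly a thin wrapper around `grpc_tools.protoc`, along these lines (the `protos/` directory and output paths are illustrative, not my exact layout):

```python
import glob

from grpc_tools import protoc


def compile_protobufs(pkgname):
    """Run grpc_tools.protoc over every .proto file, writing the generated
    *_pb2.py / *_pb2_grpc.py modules into the package directory."""
    for proto in glob.glob("protos/*.proto"):
        # protoc.main mimics the protoc CLI and returns its exit code
        exit_code = protoc.main([
            "grpc_tools.protoc",
            "-Iprotos",
            f"--python_out={pkgname}",
            f"--grpc_python_out={pkgname}",
            proto,
        ])
        if exit_code != 0:
            raise RuntimeError(f"protoc failed on {proto} (exit code {exit_code})")
```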
This works as intended: I get the pb files in my site-packages or in the wheel without them having to exist in the source folder. However, this pattern means I cannot naively `pip install pkgname` from scratch, as the `compile_protobufs` step depends on `grpcio-tools`, which does not get installed until `setup()`.
I could use `setup_requires`, but that is on the chopping block. I could just install the build dependencies first (right now I use `RUN pip install -r build-require.txt && pip install pkgname/`), but it still seems like there ought to be a cleaner way.
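For concreteness, the current container workaround looks like this, where `build-require.txt` just lists the build-time deps (at minimum `grpcio-tools`, so `compile_protobufs` can do `from grpc_tools import protoc`):

```dockerfile
# install the build-time deps first, then build/install the package itself
RUN pip install -r build-require.txt && pip install pkgname/
```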
Am I even going about this pattern correctly or am I missing some packaging idiom?
My criteria:
- Generally this is run inside a container, so minimizing external deps
- I want the `_pb2.py` files regenerated each time I `pip install`
- These files need to also make their way into any `.whl` or source tarball