
Is there a way to run inference against a BentoML service using a C# gRPC client?

I've searched and, as far as I understand, I need a .proto file for a C# gRPC client, but I have no idea how to create one for a BentoML service. I am trying to connect to a BentoML service running the code below, taken from https://docs.bentoml.org/en/latest/tutorial.html:

 import numpy as np
 import bentoml
 from bentoml.io import NumpyNdarray

 iris_clf_runner = bentoml.sklearn.get("iris_clf:latest").to_runner()

 svc = bentoml.Service("iris_classifier", runners=[iris_clf_runner])

 @svc.api(input=NumpyNdarray(), output=NumpyNdarray())
 def classify(input_series: np.ndarray) -> np.ndarray:
     result = iris_clf_runner.predict.run(input_series)
     return result
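
For reference, a rough Python client works against this service once it is started in gRPC mode (`bentoml serve-grpc`, which listens on port 3000 by default as far as I can tell), and this is the call I would like to reproduce in C#. The `bentoml.grpc.v1` stub modules, the `Call` RPC, and the `NDArray` field names below are my reading of BentoML's gRPC proto, so treat this as a sketch rather than a confirmed API:

 # Python sketch of the gRPC call (assumes `pip install "bentoml[grpc]"`
 # and the service running via `bentoml serve-grpc`)
 import grpc

 from bentoml.grpc.v1 import service_pb2 as pb
 from bentoml.grpc.v1 import service_pb2_grpc as services

 with grpc.insecure_channel("localhost:3000") as channel:
     stub = services.BentoServiceStub(channel)
     # api_name matches the @svc.api function name; the NDArray payload
     # mirrors NumpyNdarray as a dtype, a shape, and flat values
     response = stub.Call(
         pb.Request(
             api_name="classify",
             ndarray=pb.NDArray(
                 dtype=pb.NDArray.DTYPE_FLOAT,
                 shape=(1, 4),
                 float_values=[5.9, 3.0, 5.1, 1.8],
             ),
         )
     )
     print(response)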
You're correct that you will want to start with a proto to generate C# stubs. BentoML appears to not publish the proto for this service (e.g. see [`go.mod`](https://github.com/bentoml/BentoML/blob/main/grpc-client/go/go.mod)). You should consider raising an [issue](https://github.com/bentoml/BentoML/issues) to ask them to add C# stubs (to the Bazel build) and/or publish the proto file(s). – DazWilkin Apr 03 '23 at 17:30

0 Answers