Questions tagged [bentoml]
Tool to create ML-powered prediction services that are ready to deploy and scale
11 questions
2 votes, 0 answers
How to configure BentoML docker container to write log messages to a file in a mounted dir?
I'm not able to configure BentoML logs to be written in a mounted directory in my docker container.
The BentoML Logging documentation shows no parameter to write the logs to a file.
The Legacy property mapping doc page talks about a log dir, but if…
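If BentoML's own settings expose no file target, one fallback (an assumption about BentoML's internals, not documented behavior) is that BentoML emits its records through Python's standard logging under the "bentoml" logger name, so attaching a FileHandler pointed at the mounted directory would capture them:

```python
import logging
import os

# Hypothetical mounted directory; BENTO_LOG_DIR is an assumed env var,
# not a BentoML setting -- it only parameterizes this sketch.
log_dir = os.environ.get("BENTO_LOG_DIR", ".")
log_path = os.path.join(log_dir, "bentoml.log")

handler = logging.FileHandler(log_path)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
)

# Assumption: BentoML routes records through the "bentoml" logger
# hierarchy, so child loggers propagate up to this handler.
bentoml_logger = logging.getLogger("bentoml")
bentoml_logger.setLevel(logging.INFO)
bentoml_logger.addHandler(handler)

bentoml_logger.info("file logging configured")
```

Run this configuration code before the service starts handling requests, so the handler is in place when BentoML begins logging.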

neves (33,186 rep; badges 27/159/192)
1 vote, 0 answers
C# gRPC client for a BentoML serving service
Is there a way to run inference against a BentoML service using a C# gRPC client?
I've searched, and as far as I understand, I need a .proto file for a C# gRPC client, but I have no idea how to create one for a BentoML service. I am trying to connect to BentoML…

Youngwook Jun (51 rep; badges 4)
0 votes, 0 answers
Asynchronous Processing Challenge with BentoML: Need Advice on Handling Concurrent Requests
I've been working with BentoML and run into a challenge handling concurrent requests; I'm hoping for some advice. I've developed multiple APIs using BentoML, and the general structure of my script looks like the following:
There are two functions, main1…

율링율링 (1 rep)
0 votes, 0 answers
Does a BentoML model support RabbitMQ message broker integration?
Does a BentoML model support RabbitMQ message broker integration?
I have gone through many articles, but I didn't find any documentation on integrating BentoML with RabbitMQ.
Any reference article or code snippet would be helpful regarding…
0 votes, 1 answer
How to change where models and builds are saved by BentoML?
By default, builds are stored in the ~/bentoml/bentos directory. Can I change this? There's no command-line option.
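For reference, BentoML honors the BENTOML_HOME environment variable for the store location; setting it before the first bentoml import redirects both models and builds (the /data/bentoml path below is a hypothetical example):

```python
import os

# BENTOML_HOME overrides the default ~/bentoml store location.
# Set it before importing bentoml; /data/bentoml is a made-up path.
os.environ["BENTOML_HOME"] = "/data/bentoml"

# import bentoml  # models would then land under /data/bentoml/models,
#                 # builds under /data/bentoml/bentos
print(os.environ["BENTOML_HOME"])
```

The same variable can be exported in the shell before running the bentoml CLI, which avoids touching any Python code.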

neves (33,186 rep; badges 27/159/192)
0 votes, 1 answer
BentoML CORS error on loading swagger docs in k8s deployment
This is similar to https://github.com/bentoml/BentoML/issues/1909, except I am doing a k8s deployment and couldn't figure out where to put the bento_config.yaml settings in the k8s config. Current config:
apiVersion: serving.yatai.ai/v2alpha1
kind:…
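One pattern that may apply here (a sketch, not a verified Yatai feature): put a BentoML configuration file in a ConfigMap, mount it into the pod, and point the BENTOML_CONFIG environment variable at it. The CORS key names follow recent BentoML 1.x configuration docs and may differ between versions; every resource name and path below is hypothetical.

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: bento-config            # hypothetical name
data:
  configuration.yml: |
    api_server:
      http:
        cors:
          enabled: true
          access_control_allow_origins: ["*"]
---
# In the pod template of the deployment (or the CRD's pod spec, if the
# serving.yatai.ai operator forwards env/volumes), mount the ConfigMap
# and point BENTOML_CONFIG at the file:
#
#   env:
#     - name: BENTOML_CONFIG
#       value: /bentoml-config/configuration.yml
#   volumeMounts:
#     - name: bento-config
#       mountPath: /bentoml-config
#   volumes:
#     - name: bento-config
#       configMap:
#         name: bento-config
```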

Yogen Rai (2,961 rep; badges 3/25/37)
0 votes, 1 answer
BentoML: how to build a model without importing the services.py file?
Is it possible to run bentoml build without importing the services.py file during the process?
I'm trying to put the bento build and containerize steps in our CI/CD server. Our model depends on some installed OS packages and some Python packages. I…

neves (33,186 rep; badges 27/159/192)
0 votes, 1 answer
BentoML - Serving a CatBoostClassifier with cat_features
I am trying to create a BentoML service for a CatBoostClassifier model that was trained using a column as a categorical feature. If I save the model and try to make some predictions with the saved model (not as a BentoML service), all works as…

andreigeorgiu (3 rep; badges 2)
0 votes, 1 answer
bentoml containerize fails with "Unknown flag: mount"
I'm trying to run the BentoML tutorial to package and serve a machine learning model. It fails when I run bentoml containerize iris_classifier:latest
The error message is:
[+] Building 0.1s (2/2) FINISHED …
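For context (an assumption based on the error shape, not a confirmed diagnosis): the --mount instruction in generated Dockerfiles requires Docker BuildKit, so a daemon building in classic mode rejects it as an unknown flag. Enabling BuildKit before containerizing may resolve it:

```shell
# --mount in the generated Dockerfile needs BuildKit; enable it
# for the build that bentoml containerize triggers.
export DOCKER_BUILDKIT=1
bentoml containerize iris_classifier:latest
```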

neves (33,186 rep; badges 27/159/192)
0 votes, 0 answers
Trying to deploy an ML model as an API through BentoML
import bentoml
import numpy as np
from bentoml.io import NumpyNdarray
# Get the runner
xgb_runner = bentoml.models.get("xgb_booster:latest").to_runner()
# Create a Service object
svc = bentoml.Service("xgb_classifier", runners=[xgb_runner])
#…
0 votes, 1 answer
BentoML localhost server returns with TypeError: 'RunnerMethod' object is not callable
I am currently trying to deploy my machine learning model using BentoML.
I have wrapped my model in a runner using estimator_runner = bentoml.keras.get(BENTO_MODEL_TAG).to_runner()
However, when the service runs estimator_runner.predict(input_data)…
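In BentoML 1.x, runner methods are handles that are invoked through .run() (or .async_run()) rather than called directly, which would explain the TypeError. A sketch of the corrected call (not runnable without the saved Keras model, so treat it as pseudocode):

```python
# Inside the service endpoint: invoke the runner method via .run()
# instead of calling it directly.
result = estimator_runner.predict.run(input_data)

# In an async endpoint, the non-blocking form:
# result = await estimator_runner.predict.async_run(input_data)
```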

FLOROID (13 rep; badges 6)