KServe enables serverless inferencing on Kubernetes and provides performant, high abstraction interfaces for common machine learning (ML) frameworks like TensorFlow, XGBoost, scikit-learn, PyTorch, and ONNX to solve production model serving use cases.
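For context, deploying a model with KServe usually comes down to a single InferenceService manifest. A minimal sketch based on the v1beta1 sklearn example from the KServe docs (the storage URI is illustrative and would point at your own model artifact):

apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
spec:
  predictor:
    sklearn:
      # Illustrative model location; replace with your own trained model artifact.
      storageUri: "gs://kfserving-examples/models/sklearn/1.0/model"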
Questions tagged [kubeflow-kserve]
9 questions
2 votes, 1 answer
Knative and gRPC with nginx-ingress
I have installed Knative/KServe in my AWS EKS cluster. Everything is working fine, but recently we decided to try gRPC for our services deployed there. It's deployed with Istio, with an nginx ingress in front of everything, and the ingress pointing to…

dmitrii · 31 · 1
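One piece that commonly matters when putting gRPC behind ingress-nginx is the backend-protocol annotation, which makes nginx proxy HTTP/2 to the backend. A rough sketch with placeholder names (the backend here stands in for whatever service fronts the Knative/Istio gateway in this setup, and gRPC through ingress-nginx generally also needs TLS configured on the host):

apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: grpc-inference          # hypothetical name
  annotations:
    # Proxy gRPC (HTTP/2) rather than plain HTTP/1.1 to the backend
    nginx.ingress.kubernetes.io/backend-protocol: "GRPC"
spec:
  ingressClassName: nginx
  rules:
  - host: grpc.example.com      # hypothetical host
    http:
      paths:
      - path: /
        pathType: Prefix
        backend:
          service:
            name: istio-ingressgateway   # placeholder: the service in front of Knative/KServe
            port:
              number: 80                 # placeholder port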
1 vote, 0 answers
Can I deploy a kserve inference service using an XGBoost model on kserve-tritonserver?
I want to deploy an XGBoost model on kserve.
I deployed it on the default serving runtime, but I want to try it on kserve-tritonserver.
I know the kserve docs say kserve-tritonserver supports TensorFlow, ONNX, PyTorch, and TensorRT. And NVIDIA says Triton Inference…

HoonCheol Shin · 11 · 2
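For reference, newer KServe releases let you request a specific serving runtime through the model spec; whether kserve-tritonserver then accepts an XGBoost model depends on the Triton image shipping the FIL backend, so treat this as a sketch rather than a confirmed recipe (names and storage URI are placeholders):

apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: xgboost-on-triton                # hypothetical name
spec:
  predictor:
    model:
      modelFormat:
        name: xgboost
      runtime: kserve-tritonserver       # explicitly pick the Triton serving runtime
      storageUri: "gs://my-bucket/models/xgboost"   # placeholder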
1 vote, 1 answer
I got stuck at 404 during the tutorial on deploying kserve models using the sklearn-iris example
I used ‘+ New Endpoint’ under Endpoints in the Kubeflow dashboard and registered the resource below.
kind: "InferenceService"
metadata:
annotations:
isdecar.istio.is/inject: "false"
name: "sklearn-iris"
spec:
predictor:
sklearn:
…

TaeUk Noh · 96 · 5
1 vote, 0 answers
Got HTTPConnectionPool error when I send a request (POST) to the kserve example model with the Python SDK
I ran the code (case 1, case 2) locally in VS Code, not in the Notebooks of the Kubeflow central dashboard.
I'd like to know what I'm missing.
Prepared:
added user
- email: my_name@gmail.com
  hash: 1234 # actually a hash value
…

TaeUk Noh · 96 · 5
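The "added user" fragment above looks like a Dex static-password entry from the Kubeflow auth config. For orientation, such an entry usually has roughly this shape; the values are placeholders, and the hash field is expected to hold a bcrypt hash rather than a plain number:

staticPasswords:
- email: my_name@gmail.com
  hash: "$2y$12$..."          # bcrypt hash of the password (placeholder)
  username: my_name           # placeholder
  userID: "my_name"           # placeholder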
0 votes, 0 answers
Istio Ingress Controller VirtualService returning 404 for kserve service
I have installed kserve 0.9 and then deployed a tutorial kserve service using:
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-iris
  namespace: kserve-test
  labels:
    app: sklearn-iris
spec:
  predictor:
    …

Shauryagoel · 29 · 1 · 7
0 votes, 1 answer
ConnectionResetError (104, 'Connection reset by peer') in kserve?
I have a Hugging Face model (https://huggingface.co/TahaDouaji/detr-doc-table-detection) in TorchServe, and it works fine when I deploy it locally. I tried to deploy the model in kserve as a Docker image. The pods are running fine without any error, i…
0 votes, 1 answer
I ran the kserve example for sklearn-iris and got `302 Found`
serving the example model
create namespace
$ kubectl create namespace kserve-test
create InferenceService
$ kubectl apply -n kserve-test -f - <…

TaeUk Noh · 96 · 5
0 votes, 2 answers
Serving custom model from Kubeflow pipeline
I have a Kubeflow pipeline which trains a custom ML model (i.e. not based on sklearn / tensorflow etc. classes). Now I would like to add serving at the end of the pipeline, i.e. I want to have a service in my Kubernetes cluster which uses the model to…

Tomasz Cakala · 1 · 1
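For a model that does not match one of the built-in framework predictors, KServe also supports a custom predictor container; a minimal sketch with a hypothetical image that wraps the trained model behind the KServe prediction protocol:

apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: custom-model                      # hypothetical name
spec:
  predictor:
    containers:
    - name: kserve-container
      image: registry.example.com/my-custom-model:latest   # hypothetical image
      ports:
      - containerPort: 8080
        protocol: TCP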
0 votes, 2 answers
How can we specify the version of TensorFlow Serving in Kubeflow?
I am trying to use TensorFlow Serving to serve a model. When I try to apply the serve component using ksonnet, I see that the workload created on Kubernetes (GKE) is using tensorflow…

rhg · 173 · 2 · 16
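The question above dates from the ksonnet-era Kubeflow tf-serving component, but on the KServe side the analogous knob is runtimeVersion on the predictor, which pins the serving image tag. A rough sketch (version and storage URI are illustrative):

apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: tf-model                              # hypothetical name
spec:
  predictor:
    tensorflow:
      runtimeVersion: "2.6.2"                 # illustrative TensorFlow Serving image tag
      storageUri: "gs://my-bucket/models/tf"  # placeholder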