I have a Kubeflow pipeline which trains a custom ML model (i.e. not based on sklearn / TensorFlow etc. classes). Now I would like to add serving at the end of the pipeline, i.e. I want a service in my Kubernetes cluster that uses the model to answer prediction requests, and this service should be updated with the new model after each pipeline run.
As far as I know, to serve a custom model I should:

1. Wrap my model in a kfserving.KFModel class (a minimal sketch of what I mean is below this list).
2. Build a Docker image that runs the wrapper from step 1.
3. Create an InferenceService endpoint that uses the image from step 2.
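For context, here is roughly what I mean by the wrapper in step 1. This is only a sketch based on the kfserving SDK examples; MyCustomModel, the pickle path and predict_one are placeholders for my actual model code:

```python
import pickle

import kfserving


class MyCustomModel(kfserving.KFModel):
    """Exposes my custom model through the KFServing predict protocol."""

    def __init__(self, name: str, model_path: str):
        super().__init__(name)
        self.model_path = model_path
        self.model = None
        self.ready = False

    def load(self):
        # Load the serialized model produced by the training step.
        with open(self.model_path, "rb") as f:
            self.model = pickle.load(f)
        self.ready = True

    def predict(self, request: dict) -> dict:
        # "instances" is the field the KFServing data plane sends.
        instances = request["instances"]
        predictions = [self.model.predict_one(x) for x in instances]  # placeholder call
        return {"predictions": predictions}


if __name__ == "__main__":
    model = MyCustomModel("my-custom-model", "/mnt/models/model.pkl")
    model.load()
    kfserving.KFServer(workers=1).start([model])
```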
Is there a cloud-agnostic way to do this in a Kubeflow component? (So basically the component must be able to build Docker images.)
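To make that concrete, this is the kind of component I imagine, a sketch that uses Kaniko so the build does not depend on any particular cloud. The registry URL, the build-context location and the Dockerfile path are placeholders, and pushing would additionally require registry credentials mounted as a docker config secret:

```python
import kfp.dsl as dsl


def build_image_op(context_uri: str, destination: str) -> dsl.ContainerOp:
    """Pipeline step that builds and pushes the serving image with Kaniko.

    Kaniko builds images without a Docker daemon, so it runs on any
    Kubernetes cluster regardless of the cloud provider.
    """
    return dsl.ContainerOp(
        name="build-serving-image",
        image="gcr.io/kaniko-project/executor:latest",
        arguments=[
            "--dockerfile=Dockerfile",
            "--context=" + context_uri,      # e.g. a tar.gz build context in S3/GCS/MinIO
            "--destination=" + destination,  # e.g. registry.example.com/my-model:v2
        ],
    )
```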
Is there a better way to achieve this?
Maybe I should move steps 1-3 outside of the pipeline and just create a component that triggers their execution externally. Can this be done?
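For the deployment part (step 3) in particular, one variant I considered: if the image is built elsewhere, a small component could just apply or patch the InferenceService with the new image tag, so the pipeline only triggers the rollout. A sketch (the manifest, namespace and image are placeholders, and the step's service account would need RBAC permissions to manage InferenceServices):

```python
import kfp.dsl as dsl

# Hypothetical v1alpha2 manifest: a custom-container InferenceService
# pointing at the image built earlier in the pipeline.
INFERENCE_SERVICE_TEMPLATE = """
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: my-custom-model
  namespace: models
spec:
  default:
    predictor:
      custom:
        container:
          image: {image}
"""


def deploy_model_op(image: str) -> dsl.ContainerOp:
    """Pipeline step that (re)applies the InferenceService with a new image."""
    manifest = INFERENCE_SERVICE_TEMPLATE.format(image=image)
    return dsl.ContainerOp(
        name="deploy-inference-service",
        image="bitnami/kubectl:latest",
        command=["sh", "-c"],
        arguments=["echo '%s' | kubectl apply -f -" % manifest],
    )
```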