Hi all, I am new to KServe and ModelMesh Serving and am trying to get to know the software. I am trying to combine Kubeflow with ModelMesh Serving using the following pipeline in Kubeflow:
Preprocess data
Train model
Deploy model on Modelmesh
With Kubeflow, the output of each of these steps is a .tgz file containing the model data that ModelMesh requires. The problem is that serving the model does not work if you point directly to the .tgz file; I am guessing that ModelMesh cannot handle that format. However, I already have a full repository of models in S3 that are compressed as tar.gz.
It would be very useful for me if ModelMesh could handle these archive formats. I am not sure whether I am missing something and it should actually work. Does anyone know if this is currently possible?
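In the meantime, the workaround I am considering is to unpack each .tgz archive and upload the extracted files to S3 in the flat layout ModelMesh expects. A minimal sketch of the local extraction step (the function name and paths are hypothetical, and the S3 sync itself is left out):

```python
import tarfile
import tempfile
from pathlib import Path

def extract_model_archive(tgz_path: str, dest_dir: str) -> list[str]:
    """Unpack a .tgz/.tar.gz model archive so the individual files can
    then be uploaded uncompressed (e.g. synced to an S3 prefix)."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with tarfile.open(tgz_path, "r:gz") as tar:
        # Note: for untrusted archives, validate member paths before extracting.
        tar.extractall(dest)
        return [m.name for m in tar.getmembers() if m.isfile()]

# Demo: build a tiny archive the way the pipeline would, then unpack it.
with tempfile.TemporaryDirectory() as tmp:
    model_file = Path(tmp) / "model.joblib"
    model_file.write_bytes(b"dummy model bytes")
    archive = Path(tmp) / "model.tgz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(model_file, arcname="model.joblib")
    extracted = extract_model_archive(str(archive), str(Path(tmp) / "unpacked"))
    print(extracted)
```

This avoids the archive issue entirely, at the cost of an extra pipeline step and roughly doubled storage for the existing repository of compressed models.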
Hi @Thijsvandepoll, this is actually something we had intended to add support for; there is a related in-progress PR here.
In general, the plan is to unify storage handling between modelmesh-serving and single-model KServe: a common library and configuration shared by both.
Note: I can see in KServe's documentation (https://kserve.github.io/website/0.8/modelserving/storage/uri/uri/#train-and-freeze-the-model_1) that this should be possible with single-model KServe.
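For reference, pointing a single-model KServe InferenceService at a compressed archive, as that page describes, would look roughly like this (the resource name, model format, and URI are placeholders; field names follow the v1beta1 InferenceService schema):

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: sklearn-from-tgz   # placeholder name
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn      # placeholder; any supported format
      # KServe's storage initializer downloads and unpacks the archive
      storageUri: s3://my-bucket/models/model.tgz   # placeholder URI
```

It would be great if the same storageUri behavior worked for ModelMesh-backed InferenceServices.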