Is your feature request related to a problem? If so, please describe.
MLflow models are not currently supported as a `modelFormat` in ModelMesh, despite KServe offering an InferenceService for MLflow models. Is there a particular reason for this limitation?
Describe your proposed solution
A custom serving runtime can be created using the MLFlowRuntime from MLServer. However, I was wondering if there could be a more streamlined or built-in way to support MLflow models directly in ModelMesh considering the above.
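As a workaround along those lines, a custom `ServingRuntime` can point ModelMesh at MLServer's MLflow runtime. The sketch below is modeled on the built-in `mlserver-1.x` runtime shipped with modelmesh-serving; the image tag, port numbers, and resource values are illustrative assumptions, not a tested configuration:

```yaml
# Hypothetical ServingRuntime for MLflow models via MLServer.
# Image tag, ports, and resources are assumptions to adjust for your cluster.
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: mlserver-mlflow
spec:
  supportedModelFormats:
    - name: mlflow        # lets predictors declare modelFormat: mlflow
      version: "1"
      autoSelect: true
  multiModel: true        # required for ModelMesh to manage this runtime
  grpcDataEndpoint: port:8001
  grpcEndpoint: port:8085
  containers:
    - name: mlserver
      image: docker.io/seldonio/mlserver:1.3.5   # assumed tag; mlserver-mlflow is bundled
      env:
        - name: MLSERVER_MODELS_DIR
          value: /models/_mlserver_models/
        - name: MLSERVER_GRPC_PORT
          value: "8001"
        - name: MLSERVER_LOAD_MODELS_AT_STARTUP
          value: "false"  # ModelMesh loads/unloads models dynamically
      resources:
        requests:
          cpu: 500m
          memory: 1Gi
        limits:
          cpu: "1"
          memory: 1Gi
  builtInAdapter:
    serverType: mlserver
    runtimeManagementPort: 8001
    memBufferBytes: 134217728
    modelLoadingTimeoutMillis: 90000
```

With a runtime like this applied, an InferenceService (or predictor) specifying `modelFormat: mlflow` could be routed to it, though a built-in runtime would avoid every user maintaining this boilerplate themselves.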
We strongly recommend adding support for MLflow models. We use MLflow as our model registry and deploy with KServe InferenceService; we hope ModelMesh can support the MLflow model format as an InferenceService so that we can use it.