
Support for mlflow models #527

Open
liaspas opened this issue Oct 5, 2024 · 1 comment

Comments

@liaspas

liaspas commented Oct 5, 2024

Is your feature request related to a problem? If so, please describe.

MLflow models are not currently supported as a modelFormat in ModelMesh, despite KServe offering an InferenceService for MLflow models. Is there a particular reason for this limitation?

Describe your proposed solution
A custom serving runtime can be created using the MLflowRuntime from MLServer. However, I was wondering if there could be a more streamlined or built-in way to support MLflow models directly in ModelMesh, given that KServe already supports them.
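
For reference, a custom runtime along these lines can be registered as a ServingRuntime. The sketch below is adapted from the stock `mlserver-1.x` runtime shipped with modelmesh-serving, with the supported model format switched to `mlflow`; the runtime name, image tag, format version, and resource values are illustrative, and it assumes the MLServer image bundles the `mlserver-mlflow` package so MLflow models can be loaded.

```yaml
# Illustrative custom ServingRuntime for serving MLflow models on ModelMesh via MLServer.
# Adapted from the stock mlserver-1.x runtime; image tag and resources are placeholders.
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: mlserver-mlflow   # hypothetical name
spec:
  supportedModelFormats:
    - name: mlflow
      version: "2"        # illustrative
      autoSelect: true
  protocolVersions:
    - grpc-v2
  multiModel: true
  grpcEndpoint: "port:8085"
  grpcDataEndpoint: "port:8001"
  containers:
    - name: mlserver
      image: docker.io/seldonio/mlserver:1.3.5   # any MLServer image that includes mlserver-mlflow
      env:
        - name: MLSERVER_MODELS_DIR
          value: "/models/_mlserver_models/"
        - name: MLSERVER_GRPC_PORT
          value: "8001"
        - name: MLSERVER_HTTP_PORT
          value: "8002"
        - name: MLSERVER_LOAD_MODELS_AT_STARTUP
          value: "false"
        - name: MLSERVER_MODEL_NAME
          value: dummy-model-fixme   # placeholder; actual models are managed by the adapter
      resources:
        requests:
          cpu: 500m
          memory: 1Gi
        limits:
          cpu: "1"
          memory: 2Gi
  builtInAdapter:
    serverType: mlserver
    runtimeManagementPort: 8001
    memBufferBytes: 134217728
    modelLoadingTimeoutMillis: 90000
```

An InferenceService annotated with `serving.kserve.io/deploymentMode: ModelMesh` and declaring `modelFormat: mlflow` should then be scheduled onto this runtime, but a built-in equivalent would save every team from maintaining this boilerplate.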

@innovation-derby

We strongly recommend support for MLflow models.
We use MLflow as our model registry and deploy with KServe InferenceServices. We hope ModelMesh can support the MLflow model format through an InferenceService as well, so that we can use it the same way.
