## Description
Adds the ability to instantiate LLM services/clients dynamically, so
that models can be swapped out with command line arguments or config
files instead of having to refactor pipeline code.
Closes #1630
## By Submitting this PR I confirm:
- I am familiar with the [Contributing
Guidelines](https://github.com/nv-morpheus/Morpheus/blob/main/docs/source/developer_guide/contributing.md).
- When the PR is ready for review, new or existing tests cover these
changes.
- When the PR is ready for review, the documentation is up to date with
these changes.
Is this a new feature, an improvement, or a change to existing functionality?
Change
How would you describe the priority of this feature request?
Medium
Please provide a clear description of the problem this feature solves
I want to be able to instantiate LLM services/clients dynamically, so that models can be swapped out with command line arguments or config files instead of having to refactor pipeline code.
Describe your ideal solution

```python
llm_service = LLMService.create(service, **service_kwargs)
llm_client = llm_service.get_client(**model_kwargs)
```
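For illustration, a minimal sketch of how this API could be driven from the command line. The `--service`/`--model-name` flags, the default values, and the `LLMService` import path are assumptions for the example; only `LLMService.create()` and `get_client()` come from the proposal above.

```python
import argparse

# Import path is an assumption based on the Morpheus package layout.
from morpheus.llm.services.llm_service import LLMService

# Hypothetical CLI flags for illustration; not part of the proposal itself.
parser = argparse.ArgumentParser()
parser.add_argument("--service", default="OpenAI",
                    help="Name of the LLM service implementation to instantiate")
parser.add_argument("--model-name", default="gpt-3.5-turbo",
                    help="Model name passed through to the client")
args = parser.parse_args()

# Instantiate the service by name and obtain a client for the chosen model,
# without hard-coding a concrete service class in the pipeline code.
llm_service = LLMService.create(args.service)
llm_client = llm_service.get_client(model_name=args.model_name)
```

Swapping models or providers then becomes a matter of changing arguments or config values rather than editing pipeline code.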
Additional context
No response
Code of Conduct