
[python][fix] check whether config has enable_lora attribute #2616

Merged (1 commit, Dec 2, 2024)

Conversation

sindhuvahinis
Contributor

@sindhuvahinis sindhuvahinis commented Dec 2, 2024

Description

NeuronX unit test cases are failing because of this. enable_lora is only available in the lmi_dist and vLLM configs as of now; LoRA is not yet supported on Neuron.
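The failure mode described above can be sketched as follows. This is a minimal, hypothetical reproduction (the `NeuronConfig` class here is illustrative, not the actual DJL Serving config class): accessing `enable_lora` directly raises `AttributeError` on a config that never defines it, while the guarded access degrades gracefully.

```python
# Hypothetical stand-in for a Neuron config that, like the real one,
# does not define an enable_lora attribute.
class NeuronConfig:
    pass


configs = NeuronConfig()

# The pre-fix code did the equivalent of `kwargs.get("configs").enable_lora`,
# which raises AttributeError on such a config:
try:
    _ = configs.enable_lora
    direct_access_ok = True
except AttributeError:
    direct_access_ok = False

# The fix guards the attribute access instead, so LoRA parsing is
# simply skipped when the backend's config has no enable_lora flag:
lora_enabled = hasattr(configs, "enable_lora") and configs.enable_lora

print(direct_access_ok, lora_enabled)
```

With this guard in place, backends that never set `enable_lora` (such as Neuron) fall through without error instead of crashing during request parsing.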

@sindhuvahinis sindhuvahinis requested review from zachgk and a team as code owners December 2, 2024 18:45
@sindhuvahinis sindhuvahinis changed the title [python] check whether config has enable_lora attribute [python][fix] check whether config has enable_lora attribute Dec 2, 2024
@@ -198,7 +198,8 @@ def add_server_maintained_params(request_input: RequestInput,

 def parse_adapters(request_input: TextInput, input_item: Input,
                    input_map: Dict, **kwargs):
-    if kwargs.get("configs").enable_lora:
+    configs = kwargs.get("configs")
+    if hasattr(configs, "enable_lora") and configs.enable_lora:
Suggested change:
-    if hasattr(configs, "enable_lora") and configs.enable_lora:
+    if getattr(configs, "enable_lora", False):
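The reviewer's suggestion is behaviorally equivalent to the merged `hasattr` check: `getattr(obj, name, default)` returns the attribute when it exists and the default otherwise, collapsing the two-step guard into one call. A minimal sketch (the config classes here are hypothetical stand-ins, not DJL Serving's actual classes):

```python
# Hypothetical configs: one without the flag (Neuron-like),
# one with it enabled (vLLM/lmi_dist-like).
class BareConfig:
    pass


class LoraConfig:
    enable_lora = True


# Both idioms agree on every case: attribute absent, or present and truthy.
for cfg in (BareConfig(), LoraConfig()):
    via_hasattr = hasattr(cfg, "enable_lora") and cfg.enable_lora
    via_getattr = getattr(cfg, "enable_lora", False)
    assert bool(via_hasattr) == bool(via_getattr)

print("equivalent")
```

`getattr` with a default is generally the more idiomatic form; one subtle difference is that `hasattr(...) and cfg.enable_lora` may evaluate to `False` (the short-circuit result) while `getattr` returns the attribute's actual value, but both are equivalent under `bool()`.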

@sindhuvahinis sindhuvahinis merged commit 3e38ebf into deepjavalibrary:master Dec 2, 2024
9 checks passed