[editor] Fix Model Parsers prompt_metadata remember_chat_context #1209
Conversation
```ts
  },
},
required: ["model", "max_tokens_to_sample", "stop_sequences"],
```
This is just UI for now, right? It doesn't actually enforce the required-ness?
Yeah, it just shows the red required * for the setting input. Also, just noting this was existing code.
lgtm
```diff
@@ -14,7 +14,7 @@ export const ClaudeBedrockPromptSchema: PromptSchema = {
       type: "string",
     },
     max_tokens_to_sample: {
-      type: "number",
+      type: "integer",
```
Nice catch! Did you look at the docs for this, or did an error pop up when testing locally?
Just noticed when testing locally that it allowed setting floats, which didn't make sense to me, so I checked to confirm before updating.
# [editor] Add 'Metadata' Tab to Prompt Side Pane

Currently the editor does not provide a way to modify general prompt metadata outside of model settings and parameters. For some of our chat parsers, this means the user cannot modify the 'remember_chat_context' setting. This PR implements the UX for modifying this metadata, adding it as a tab in the side panel associated with prompt_metadata in the prompt schema:

https://github.com/lastmile-ai/aiconfig/assets/5060851/353e2e47-2210-4351-96ef-81eea03c2aab

![Screenshot 2024-02-09 at 11 03 06 AM](https://github.com/lastmile-ai/aiconfig/assets/5060851/cd371457-9fd1-45d4-996e-df470e048196)
![Screenshot 2024-02-09 at 11 03 11 AM](https://github.com/lastmile-ai/aiconfig/assets/5060851/82d646e5-e96a-427c-902e-319f16580b71)

Make sure generic metadata can be set as well:

<img width="437" alt="Screenshot 2024-02-09 at 11 22 55 AM" src="https://github.com/lastmile-ai/aiconfig/assets/5060851/6e4fbe08-280c-4fa2-9610-4ee86e1cfc2d">
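For context, here is a minimal sketch of where this metadata lives on a prompt; only `remember_chat_context` is taken from the PRs above, and the other field names and values are illustrative assumptions:

```ts
// Illustrative only: a prompt whose metadata the new 'Metadata' tab would edit.
// Field names other than metadata.remember_chat_context are assumptions.
const examplePrompt = {
  name: "prompt_1",
  input: "Summarize the conversation so far.",
  metadata: {
    model: { name: "gpt-4" }, // model settings, already editable in the side pane
    remember_chat_context: true, // general prompt metadata surfaced by the new tab
  },
};
```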
# [editor][ez] Only Render SidePanel Tab Content When Active Tab

I noticed each of the tab components was being rendered even when it's not shown. We can improve render perf by only rendering them when their tab is selected; a rough sketch of the pattern follows below.

## Testing:
- Just ensure tabs still work
- Use console.log to ensure the components are only rendered when shown
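Roughly, the change amounts to conditionally mounting tab content instead of always rendering every tab. A minimal sketch of that pattern, assuming a plain React state-driven tab switch (the actual editor may use a tabs component from its UI library); component and tab names here are placeholders:

```tsx
// Minimal sketch of the pattern, not the actual editor code.
import { useState } from "react";

const SettingsTab = () => <div>Model settings…</div>;
const MetadataTab = () => <div>Prompt metadata…</div>;

type Tab = "settings" | "metadata";

export function SidePanel() {
  const [activeTab, setActiveTab] = useState<Tab>("settings");

  return (
    <div>
      <button onClick={() => setActiveTab("settings")}>Settings</button>
      <button onClick={() => setActiveTab("metadata")}>Metadata</button>
      {/* Only the selected tab's component is mounted; the other renders nothing */}
      {activeTab === "settings" && <SettingsTab />}
      {activeTab === "metadata" && <MetadataTab />}
    </div>
  );
}
```

Because inactive tab components are never mounted, their render work (and any hooks they run) is skipped entirely until the tab is selected.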
# [editor] Fix Model Parsers prompt_metadata remember_chat_context

Update the relevant model parsers to have a default and description for remember_chat_context. Also, remove it from the ones that don't actually support it in their parser.
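As a sketch of what that looks like (not the exact schema code from this PR; the default value and description text are placeholders), a chat-capable parser's prompt schema might declare:

```ts
// Hypothetical prompt_metadata entry for a chat-capable model parser's schema,
// following the JSON-schema-style shape of the model settings in the diff above.
export const ExampleChatPromptSchema = {
  // input and model_settings omitted for brevity
  prompt_metadata: {
    type: "object",
    properties: {
      remember_chat_context: {
        type: "boolean",
        default: true,
        description:
          "Include previous prompts and their outputs from this aiconfig as chat history for this prompt.",
      },
    },
  },
};
```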
Stack created with Sapling. Best reviewed with ReviewStack.