Remove extra `python` dir from the LLama Guard extension (#653)

Remove extra `python` dir from the LLama Guard extension. I also made the READMEs clearer and easier to understand. I'll publish this after
Showing 8 changed files with 23 additions and 27 deletions.
File renamed without changes.
```diff
@@ -0,0 +1,2 @@
+Please see our [`LLama-Guard/python` dir](https://github.com/lastmile-ai/aiconfig/tree/main/extensions/LLama-Guard/python) for the README file.
```
````diff
@@ -1,20 +1,28 @@
 # LLama Guard with AIConfig
 
 LLama Guard is a 7b model released by Meta. This extension allows you to use it with AIConfig.
-Note: This extension also loads the entire model into memory.
+> [!NOTE] This extension also loads the entire model into memory.
 
 ## Usage
+LLaMA Guard allows you to define your own “safety taxonomy” — custom policies to determine which interactions are safe vs. unsafe between humans (prompts) and AI models (responses). What makes this cool is that it allows you to enforce your own policies _ON TOP_ of the standard guardrails that a model ships with (instead of merely overriding them).
 
-### Installation, Importing, and using this extension
-1. run `pip install aiconfig_extension_llama_guard` in your shell
-2. `from aiconfig_extension_llama_guard import LLamageGuardParser`
-3. In code, construct and load the model parser from this extension into the registry: `ModelParserRegistry.register_model_parser(LLamageGuard())`. You can read the docstrings under the `ModelParserRegistry` class for more info.
+## Part 1: Installing, Importing, and using this extension
+1. Install this module: run `pip3 install aiconfig_extension_llama_guard` in your terminal
+2. Add these lines to your code:
+```python
+from aiconfig_extension_llama_guard import LLamageGuardParser
+from aiconfig.registry import ModelParserRegistry
+```
+3. In code, construct and load the model parser from this extension into the registry: `ModelParserRegistry.register_model_parser(LLamageGuard())`. You can read the docstrings under the `ModelParserRegistry` class for more info on what this does.
+4. Use the `LLamageGuard` model parser however you please. Check out our tutorial to get started ([video walkthrough](https://www.youtube.com/watch?v=XxggqoqIVdg), [Jupyter notebook](https://github.com/lastmile-ai/aiconfig/tree/v1.1.8/cookbooks/LLaMA-Guard)).
 
-## Local Testing
-### Update and test this extension
-1. Navigate to `extensions/LLama-Guard/python`, run this command: `pip install build && cd python && python -m build && pip install dist/*.whl`
-2. After you're done testing, be sure to delete the generated `dist` folder(s) in the same dir. It'll probably look something like `python/dist` and `python/<package_name>.egg-info`
+## Part 2: Updating & Developing this extension
+
+If you are not developing this extension locally (just using the published extension), feel free to ignore this part.
+
+1. Navigate to `extensions/LLama-Guard/python` and run this command: `pip3 install -e .` (this creates a local copy of the python module which is linked to this directory)
+2. Edit and test the extension as you please. Feel free to submit a pull request on GitHub!
+3. After you're done testing, be sure to uninstall the local link to this directory if you ever want to use the published version: `pip3 uninstall aiconfig_extension_llama_guard`
````
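The "safety taxonomy" idea in the README's Usage section can be pictured as a rule table applied to each prompt or response, with custom categories checked in addition to the defaults. Here is a self-contained toy sketch of that layering; the category names and the `check` helper are hypothetical illustrations, not LLama Guard's actual taxonomy format (the real model classifies text with a 7b LLM, not keyword matching):

```python
# Toy illustration of a custom safety taxonomy layered ON TOP of
# default rules. Category names and check() are hypothetical; the real
# LLama Guard model classifies text with a 7b LLM, not keywords.

DEFAULT_TAXONOMY = {
    "violence": ["attack", "hurt"],
}

CUSTOM_TAXONOMY = {
    # Custom policies enforced in addition to the defaults,
    # not instead of them.
    "financial_advice": ["guaranteed returns", "insider"],
}

def check(text: str) -> list[str]:
    """Return every taxonomy category the text violates."""
    violations = []
    for taxonomy in (DEFAULT_TAXONOMY, CUSTOM_TAXONOMY):
        for category, keywords in taxonomy.items():
            if any(k in text.lower() for k in keywords):
                violations.append(category)
    return violations

print(check("Buy now for guaranteed returns!"))  # ['financial_advice']
```

The point of the layering is the loop over both taxonomies: a text that passes the custom policy can still be flagged by the defaults, matching the "on top of, not overriding" behavior described above.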
```diff
@@ -3,12 +3,13 @@ requires = ["setuptools", "wheel"]
 
 [project]
 name = "aiconfig_extension_llama_guard"
-version = "0.0.2"
+version = "0.0.3"
 authors = [
   { name="LastMile AI" },
   { name="Ankush Pala", email="[email protected]" },
+  { name="Rossdan Craig", email="[email protected]" },
 ]
-description = "An extension for using llama-guard with aiconfig"
+description = "An extension for using LLama Guard with aiconfig"
 readme = "README.md"
 requires-python = ">=3.10"
 classifiers = [
```
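Since the version bump above signals a new release, it can be handy to confirm which version of the extension is actually installed in your environment. A small stdlib sketch (the `installed_version` helper is a hypothetical name; the distribution name comes from the `name` field in `pyproject.toml`):

```python
from importlib import metadata

def installed_version(dist_name: str) -> "str | None":
    """Return the installed version of a distribution, or None if it
    is not installed in the current environment."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# e.g. "0.0.3" once the new release is installed, None otherwise
print(installed_version("aiconfig_extension_llama_guard"))
```

This is also a quick way to verify that `pip3 uninstall` in Part 2 actually removed the editable install before switching back to the published package.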
This file was deleted.
File renamed without changes.
File renamed without changes.
File renamed without changes.