diff --git a/docs/docs/integrations/providers/predictionguard.mdx b/docs/docs/integrations/providers/predictionguard.mdx
index 5e01eeef14dbe..542c20d077e42 100644
--- a/docs/docs/integrations/providers/predictionguard.mdx
+++ b/docs/docs/integrations/providers/predictionguard.mdx
@@ -4,99 +4,56 @@
 This page covers how to use the Prediction Guard ecosystem within LangChain.
 It is broken into two parts: installation and setup, and then references to specific Prediction Guard wrappers.

 ## Installation and Setup
-- Install the Python SDK with `pip install predictionguard`
-- Get a Prediction Guard access token (as described [here](https://docs.predictionguard.com/)) and set it as an environment variable (`PREDICTIONGUARD_TOKEN`)

-## LLM Wrapper
-
-There exists a Prediction Guard LLM wrapper, which you can access with
-```python
-from langchain_community.llms import PredictionGuard
+- Install the Python SDK:
 ```
-
-You can provide the name of the Prediction Guard model as an argument when initializing the LLM:
-```python
-pgllm = PredictionGuard(model="MPT-7B-Instruct")
+pip install predictionguard
 ```

-You can also provide your access token directly as an argument:
-```python
-pgllm = PredictionGuard(model="MPT-7B-Instruct", token="")
-```
+- Get a Prediction Guard API key (as described [here](https://docs.predictionguard.com/)) and set it as an environment variable (`PREDICTIONGUARD_API_KEY`)

-Finally, you can provide an "output" argument that is used to structure/ control the output of the LLM:
-```python
-pgllm = PredictionGuard(model="MPT-7B-Instruct", output={"type": "boolean"})
-```
+
+## Prediction Guard LangChain Integrations
+|API|Description|Endpoint Docs|Import|Example Usage|
+|---|---|---|---|---|
+|Completions|Generate Text|[Completions](https://docs.predictionguard.com/api-reference/api-reference/completions)|`from langchain_community.llms.predictionguard import PredictionGuard`|[predictionguard.ipynb](/docs/integrations/llms/predictionguard)|
+|Text Embedding|Embed String to Vectors|[Embeddings](https://docs.predictionguard.com/api-reference/api-reference/embeddings)|`from langchain_community.embeddings.predictionguard import PredictionGuardEmbeddings`|[predictionguard.ipynb](/docs/integrations/text_embedding/predictionguard)|

-## Example usage
+## Getting Started

-Basic usage of the controlled or guarded LLM wrapper:
-```python
-import os
+## Embedding Models

-import predictionguard as pg
-from langchain_community.llms import PredictionGuard
-from langchain_core.prompts import PromptTemplate
-from langchain.chains import LLMChain
-
-# Your Prediction Guard API key. Get one at predictionguard.com
-os.environ["PREDICTIONGUARD_TOKEN"] = ""
-
-# Define a prompt template
-template = """Respond to the following query based on the context.
-
-Context: EVERY comment, DM + email suggestion has led us to this EXCITING announcement! 🎉 We have officially added TWO new candle subscription box options! 📦
-Exclusive Candle Box - $80
-Monthly Candle Box - $45 (NEW!)
-Scent of The Month Box - $28 (NEW!)
-Head to stories to get ALL the deets on each box! 👆 BONUS: Save 50% on your first box with code 50OFF! 🎉
-
-Query: {query}
-
-Result: """
-prompt = PromptTemplate.from_template(template)
-
-# With "guarding" or controlling the output of the LLM. See the
-# Prediction Guard docs (https://docs.predictionguard.com) to learn how to
-# control the output with integer, float, boolean, JSON, and other types and
-# structures.
-pgllm = PredictionGuard(model="MPT-7B-Instruct",
-                        output={
-                            "type": "categorical",
-                            "categories": [
-                                "product announcement",
-                                "apology",
-                                "relational"
-                            ]
-                        })
-pgllm(prompt.format(query="What kind of post is this?"))
+### Prediction Guard Embeddings
+
+See a [usage example](/docs/integrations/text_embedding/predictionguard)
+
+```python
+from langchain_community.embeddings.predictionguard import PredictionGuardEmbeddings
+```

-Basic LLM Chaining with the Prediction Guard wrapper:
+#### Usage
 ```python
-import os
+# If predictionguard_api_key is not passed, default behavior is to use the `PREDICTIONGUARD_API_KEY` environment variable.
+embeddings = PredictionGuardEmbeddings(model="bridgetower-large-itm-mlm-itc")

-from langchain_core.prompts import PromptTemplate
-from langchain.chains import LLMChain
-from langchain_community.llms import PredictionGuard
+text = "This is an embedding example."
+output = embeddings.embed_query(text)
+```

-# Optional, add your OpenAI API Key. This is optional, as Prediction Guard allows
-# you to access all the latest open access models (see https://docs.predictionguard.com)
-os.environ["OPENAI_API_KEY"] = ""
-# Your Prediction Guard API key. Get one at predictionguard.com
-os.environ["PREDICTIONGUARD_TOKEN"] = ""
-pgllm = PredictionGuard(model="OpenAI-gpt-3.5-turbo-instruct")
+
+## LLMs
+### Prediction Guard LLM

-template = """Question: {question}
+See a [usage example](/docs/integrations/llms/predictionguard)

-Answer: Let's think step by step."""
-prompt = PromptTemplate.from_template(template)
-llm_chain = LLMChain(prompt=prompt, llm=pgllm, verbose=True)
+
+```python
+from langchain_community.llms import PredictionGuard
+```

-question = "What NFL team won the Super Bowl in the year Justin Beiber was born?"
+
+#### Usage
+```python
+# If predictionguard_api_key is not passed, default behavior is to use the `PREDICTIONGUARD_API_KEY` environment variable.
+llm = PredictionGuard(model="Hermes-2-Pro-Llama-3-8B")

-llm_chain.predict(question=question)
+llm.invoke("Tell me a joke about bears")
 ```
diff --git a/docs/docs/integrations/text_embedding/predictionguard.ipynb b/docs/docs/integrations/text_embedding/predictionguard.ipynb
index cbb8a2f8f3173..b09d5bd924c1b 100644
--- a/docs/docs/integrations/text_embedding/predictionguard.ipynb
+++ b/docs/docs/integrations/text_embedding/predictionguard.ipynb
@@ -42,18 +42,18 @@
 {
  "metadata": {
   "ExecuteTime": {
-   "end_time": "2024-10-08T18:59:10.422135Z",
-   "start_time": "2024-10-08T18:59:10.419563Z"
+   "end_time": "2024-11-08T16:20:01.598574Z",
+   "start_time": "2024-11-08T16:20:01.595887Z"
   }
  },
 "cell_type": "code",
 "source": [
  "import os\n",
  "\n",
-  "os.environ[\"PREDICTIONGUARD_API_KEY\"] = \"\""
+  "os.environ[\"PREDICTIONGUARD_API_KEY\"] = \"
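
The usage snippets added in both files state the same convention: if `predictionguard_api_key` is not passed, the integration falls back to the `PREDICTIONGUARD_API_KEY` environment variable. A minimal sketch of that fallback pattern is below; the `resolve_api_key` helper and the placeholder key are hypothetical, for illustration only, and are not the actual LangChain or Prediction Guard implementation:

```python
import os


def resolve_api_key(explicit_key=None, env_var="PREDICTIONGUARD_API_KEY"):
    """An explicitly passed key wins; otherwise fall back to the environment variable."""
    if explicit_key is not None:
        return explicit_key
    key = os.environ.get(env_var)
    if not key:
        raise ValueError(f"Pass an API key or set the {env_var} environment variable")
    return key


# Simulate the documented behavior with a placeholder key.
os.environ["PREDICTIONGUARD_API_KEY"] = "pg-placeholder-key"
print(resolve_api_key())            # prints "pg-placeholder-key" (env fallback)
print(resolve_api_key("explicit"))  # prints "explicit" (argument wins)
```

This mirrors why the docs snippets can omit the key argument entirely when the environment variable is already set.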