
Merge pull request #30 from predictionguard/Actually-Fixing-Get-Requests
Fixing the OpenAPI spec for GET requests and bumping the Fern version.
jmansdorfer authored Jul 25, 2024
2 parents e6ab143 + a47c6e3 commit aaebc34
Showing 3 changed files with 559 additions and 625 deletions.
22 changes: 10 additions & 12 deletions fern/docs/pages/options/prompts.mdx
````diff
@@ -10,21 +10,19 @@ follow a specific prompt format. These models are fine-tuned using prompt data,
 and if you match your prompt formats to that training data format then you can
 see boosts in performance.
 
-Check out the [model details page](details) to learn which prompt formats match
+Check out the [model details page](models) to learn which prompt formats match
 certain LLMs. We've included some of the most important prompt formats below.
 
-import { Callout } from "nextra-theme-docs";
-
-<Callout type="info" emoji="ℹ️">
+<Info>
 **Note on Chat model prompts** - For your convenience, we automatically apply
 the right prompt formats when you supply a `messages` object to our
-`/chat/completions` endpoint or via the `client.chat.completions.create()` method in the
-Python client. You don't have to add in special tokens or apply the below
-prompt formats as this will duplicate the formatting. However, if you want to
-use chat-tuned models in the `/completions` endpoint or via the
-`client.completions.create()` method, you should apply the appropriate one of the
-below prompt formats.
-</Callout>
+`/chat/completions` endpoint or via the `client.chat.completions.create()`
+method in the Python client. You don't have to add in special tokens or apply
+the below prompt formats as this will duplicate the formatting. However, if
+you want to use chat-tuned models in the `/completions` endpoint or via the
+`client.completions.create()` method, you should apply the appropriate one of
+the below prompt formats.
+</Info>
 
 ## Alpaca
 
@@ -108,4 +106,4 @@ DDL statements:
 {create_table_statements}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
 The following SQL query best answers the question {user_question}:
-```
+```
````
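As the docs text in the diff notes, chat-tuned models used through the `/completions` endpoint need the prompt format applied by hand. A minimal sketch of assembling the tail of the Llama 3 SQL prompt shown at the end of the diff (the helper name and the example question/DDL values are hypothetical; the token layout follows the diff):

```python
# Hypothetical helper: assembles the tail of the Llama 3 SQL prompt template
# shown in the diff, for manual use with the /completions endpoint.
def format_sql_prompt_tail(user_question: str, create_table_statements: str) -> str:
    return (
        "DDL statements:\n"
        f"{create_table_statements}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n"
        f"The following SQL query best answers the question {user_question}:\n"
    )

tail = format_sql_prompt_tail(
    "how many users signed up last week?",
    "CREATE TABLE users (id INT, created_at DATE);",
)
print(tail)
```

Because `/chat/completions` applies this formatting automatically when given a `messages` object, sending a pre-formatted string there would duplicate the special tokens.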
2 changes: 1 addition & 1 deletion fern/fern.config.json
```diff
@@ -1,4 +1,4 @@
 {
   "organization": "Prediction-Guard",
-  "version": "0.30.0"
+  "version": "0.35.0"
 }
```
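The config change just pins a newer Fern CLI version; after this commit the full `fern/fern.config.json` reads:

```json
{
  "organization": "Prediction-Guard",
  "version": "0.35.0"
}
```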
