Fixing OpenAPI spec for get requests and fern v.
1. Fixing OpenAPI spec for get requests.
2. Adding Auth to Get as needed
3. Fixing OpenAPI spec for embeddings
4. Fixing Prompt Formatting Error and fixing link.
5. Updating Fern Version
edmcquinn committed Jul 25, 2024
1 parent e6ab143 commit 0352410
Showing 3 changed files with 560 additions and 626 deletions.
22 changes: 10 additions & 12 deletions fern/docs/pages/options/prompts.mdx
@@ -10,21 +10,19 @@ follow a specific prompt format. These models are fine-tuned using prompt data,
 and if you match your prompt formats to that training data format then you can
 see boosts in performance.
 
-Check out the [model details page](details) to learn which prompt formats match
+Check out the [model details page](models) to learn which prompt formats match
 certain LLMs. We've included some of the most important prompt formats below.
 
-import { Callout } from "nextra-theme-docs";
-
-<Callout type="info" emoji="ℹ️">
+<Info>
 **Note on Chat model prompts** - For your convenience, we automatically apply
 the right prompt formats when you supply a `messages` object to our
-`/chat/completions` endpoint or via the `client.chat.completions.create()` method in the
-Python client. You don't have to add in special tokens or apply the below
-prompt formats as this will duplicate the formatting. However, if you want to
-use chat-tuned models in the `/completions` endpoint or via the
-`client.completions.create()` method, you should apply the appropriate one of the
-below prompt formats.
-</Callout>
+`/chat/completions` endpoint or via the `client.chat.completions.create()`
+method in the Python client. You don't have to add in special tokens or apply
+the below prompt formats as this will duplicate the formatting. However, if
+you want to use chat-tuned models in the `/completions` endpoint or via the
+`client.completions.create()` method, you should apply the appropriate one of
+the below prompt formats.
+</Info>
 
 ## Alpaca

@@ -108,4 +106,4 @@ DDL statements:
 {create_table_statements}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
 The following SQL query best answers the question {user_question}:
-```
+```
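The note in the prompts.mdx diff above distinguishes two call paths: send a raw `messages` object to `/chat/completions` (the template is applied server-side), or apply the model's prompt format yourself before calling `/completions`. A minimal sketch of that rule, using a hypothetical `apply_chat_template` helper as a stand-in for the server-side formatting (the real per-model templates and special tokens differ; only the method names come from the docs):

```python
# Hypothetical stand-in for the formatting that /chat/completions applies
# automatically; real model templates (Alpaca, Llama 3, etc.) use different
# special tokens, so treat these as placeholders.
def apply_chat_template(messages):
    return "".join(
        f"<|start|>{m['role']}\n{m['content']}<|end|>\n" for m in messages
    )

messages = [{"role": "user", "content": "What is a prompt format?"}]

# /chat/completions path: pass `messages` as-is and let the service
# apply the template; don't add special tokens yourself.
server_side_prompt = apply_chat_template(messages)

# /completions path with a chat-tuned model: apply the template yourself
# and send the resulting string.
client_side_prompt = apply_chat_template(messages)

# Either path should yield the template applied exactly once.
assert server_side_prompt == client_side_prompt

# The mistake the note warns about: templating an already formatted
# prompt wraps the special tokens a second time.
double_formatted = apply_chat_template(
    [{"role": "user", "content": server_side_prompt}]
)
assert double_formatted.count("<|start|>") == 2
```

Mixing the two paths (pre-formatting a prompt and then sending it through the chat endpoint) is what duplicates the formatting.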
2 changes: 1 addition & 1 deletion fern/fern.config.json
@@ -1,4 +1,4 @@
 {
   "organization": "Prediction-Guard",
-  "version": "0.30.0"
+  "version": "0.35.0"
 }