
Reformat python code samples for text-generation pages
Max Shkutnyk committed Dec 18, 2024
1 parent 373b29b commit db4e025
Showing 8 changed files with 220 additions and 181 deletions.
70 changes: 38 additions & 32 deletions fern/pages/text-generation/chat-api.mdx
@@ -14,14 +14,16 @@ The Chat API endpoint is used to generate text with Cohere LLMs. This endpoint f
<CodeBlocks>
```python PYTHON
import cohere

co = cohere.Client(api_key="<YOUR API KEY>")

response = co.chat(
-    model="command-r-plus-08-2024",
-    message="Write a title for a blog post about API design. Only output the title text."
+    model="command-r-plus-08-2024",
+    message="Write a title for a blog post about API design. Only output the title text.",
)

-print(response.text) # "The Art of API Design: Crafting Elegant and Powerful Interfaces"
+print(response.text)
+# "The Art of API Design: Crafting Elegant and Powerful Interfaces"
```
```java JAVA
public class ChatPost {
    // … (remainder of Java sample collapsed in diff view)
}
```

@@ -106,20 +108,24 @@ The user message in the Chat request can be sent together with a `chat_history`

```python PYTHON
import cohere

co = cohere.Client(api_key="<YOUR API KEY>")

message = "Can you tell me about LLMs?"

response = co.chat(
-    model="command-r-plus-08-2024",
-    chat_history=[
-        {"role": "USER", "text": "Hey, my name is Michael!"},
-        {"role": "CHATBOT", "text": "Hey Michael! How can I help you today?"},
-    ],
-    message=message
+    model="command-r-plus-08-2024",
+    chat_history=[
+        {"role": "USER", "text": "Hey, my name is Michael!"},
+        {
+            "role": "CHATBOT",
+            "text": "Hey Michael! How can I help you today?",
+        },
+    ],
+    message=message,
)

-print(response.text) # "Sure thing Michael, LLMs are ..."
+print(response.text)  # "Sure thing Michael, LLMs are ..."
```

Instead of manually building the `chat_history`, we can grab it from the response of the previous turn.
@@ -129,22 +135,21 @@
```python PYTHON
chat_history = []
max_turns = 10

for _ in range(max_turns):
-    # get user input
-    message = input("Send the model a message: ")
-
-    # generate a response with the current chat history
-    response = co.chat(
-        model="command-r-plus-08-2024",
-        message=message,
-        chat_history=chat_history
-    )
-
-    # print the model's response on this turn
-    print(response.text)
-
-    # set the chat history for next turn
-    chat_history = response.chat_history
+    # get user input
+    message = input("Send the model a message: ")
+
+    # generate a response with the current chat history
+    response = co.chat(
+        model="command-r-plus-08-2024",
+        message=message,
+        chat_history=chat_history,
+    )
+
+    # print the model's response on this turn
+    print(response.text)
+
+    # set the chat history for next turn
+    chat_history = response.chat_history
```

### Using `conversation_id` to Save Chat History
@@ -153,12 +158,13 @@ Providing the model with the conversation history is one way to have a multi-tur

```python PYTHON
import cohere

co = cohere.Client("<YOUR API KEY>")

response = co.chat(
-    model="command-r-plus-08-2024",
-    message="The secret word is 'fish', remember that.",
-    conversation_id='user_defined_id_1',
+    model="command-r-plus-08-2024",
+    message="The secret word is 'fish', remember that.",
+    conversation_id="user_defined_id_1",
)

answer = response.text
```

@@ -168,12 +174,12 @@ Then, if you wanted to continue the conversation, you could do so like this (kee

```python PYTHON
response2 = co.chat(
-    model="command-r-plus-08-2024",
-    message="What is the secret word?",
-    conversation_id='user_defined_id_1'
+    model="command-r-plus-08-2024",
+    message="What is the secret word?",
+    conversation_id="user_defined_id_1",
)

-print(response2.text) # "The secret word is 'fish'"
+print(response2.text)  # "The secret word is 'fish'"
```

Note that the `conversation_id` should not be used in conjunction with the `chat_history`. They are mutually exclusive.
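To make the exclusivity rule concrete, here is a small illustrative sketch. The `validate_chat_request` helper is hypothetical (not part of the Cohere SDK); it only demonstrates the constraint the note above describes:

```python
def validate_chat_request(payload: dict) -> dict:
    """Reject payloads that set both chat_history and conversation_id."""
    if "chat_history" in payload and "conversation_id" in payload:
        raise ValueError(
            "chat_history and conversation_id are mutually exclusive"
        )
    return payload


# Managing history manually is fine on its own...
validate_chat_request({"message": "Hi", "chat_history": []})

# ...and so is a server-side conversation_id on its own...
validate_chat_request({"message": "Hi", "conversation_id": "user_defined_id_1"})

# ...but combining them should be rejected.
try:
    validate_chat_request(
        {"message": "Hi", "chat_history": [], "conversation_id": "id_1"}
    )
except ValueError as e:
    print(e)
```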
Expand Down
120 changes: 56 additions & 64 deletions fern/pages/text-generation/documents-and-citations.mdx
@@ -22,76 +22,68 @@ Here's an example of interacting with document mode via the Postman API service.

```python PYTHON
{
-  "message": "Where do the tallest penguins live?",
-  "documents": [
-    {
-      "title": "Tall penguins",
-      "snippet": "Emperor penguins are the tallest."
-    },
-    {
-      "title": "Penguin habitats",
-      "snippet": "Emperor penguins only live in Antarctica."
-    },
-    {
-      "title": "What are animals?",
-      "snippet": "Animals are different from plants."
-    }
-  ],
-  "prompt_truncation": "AUTO"
+    "message": "Where do the tallest penguins live?",
+    "documents": [
+        {
+            "title": "Tall penguins",
+            "snippet": "Emperor penguins are the tallest.",
+        },
+        {
+            "title": "Penguin habitats",
+            "snippet": "Emperor penguins only live in Antarctica.",
+        },
+        {
+            "title": "What are animals?",
+            "snippet": "Animals are different from plants.",
+        },
+    ],
+    "prompt_truncation": "AUTO",
}
```

Here's an example reply:

```python PYTHON
{
-  "response_id": "ea9eaeb0-073c-42f4-9251-9ecef5b189ef",
-  "text": "The tallest penguins, Emperor penguins, live in Antarctica.",
-  "generation_id": "1b5565da-733e-4c14-9ff5-88d18a26da96",
-  "token_count": {
-    "prompt_tokens": 445,
-    "response_tokens": 13,
-    "total_tokens": 458,
-    "billed_tokens": 20
-  },
-  "meta": {
-    "api_version": {
-      "version": "2022-12-06"
-    }
-  },
-  "citations": [
-    {
-      "start": 22,
-      "end": 38,
-      "text": "Emperor penguins",
-      "document_ids": [
-        "doc_0"
-      ]
-    },
-    {
-      "start": 48,
-      "end": 59,
-      "text": "Antarctica.",
-      "document_ids": [
-        "doc_1"
-      ]
-    }
-  ],
-  "documents": [
-    {
-      "id": "doc_0",
-      "title": "Tall penguins",
-      "snippet": "Emperor penguins are the tallest.",
-      "url": ""
-    },
-    {
-      "id": "doc_1",
-      "title": "Penguin habitats",
-      "snippet": "Emperor penguins only live in Antarctica.",
-      "url": ""
-    }
-  ],
-  "search_queries": []
+    "response_id": "ea9eaeb0-073c-42f4-9251-9ecef5b189ef",
+    "text": "The tallest penguins, Emperor penguins, live in Antarctica.",
+    "generation_id": "1b5565da-733e-4c14-9ff5-88d18a26da96",
+    "token_count": {
+        "prompt_tokens": 445,
+        "response_tokens": 13,
+        "total_tokens": 458,
+        "billed_tokens": 20,
+    },
+    "meta": {"api_version": {"version": "2022-12-06"}},
+    "citations": [
+        {
+            "start": 22,
+            "end": 38,
+            "text": "Emperor penguins",
+            "document_ids": ["doc_0"],
+        },
+        {
+            "start": 48,
+            "end": 59,
+            "text": "Antarctica.",
+            "document_ids": ["doc_1"],
+        },
+    ],
+    "documents": [
+        {
+            "id": "doc_0",
+            "title": "Tall penguins",
+            "snippet": "Emperor penguins are the tallest.",
+            "url": "",
+        },
+        {
+            "id": "doc_1",
+            "title": "Penguin habitats",
+            "snippet": "Emperor penguins only live in Antarctica.",
+            "url": "",
+        },
+    ],
+    "search_queries": [],
}
```
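Since each entry in `citations` carries character offsets into `text`, the grounded spans can be recovered by slicing. A minimal sketch over the reply above, treating it as a plain Python dict:

```python
reply = {
    "text": "The tallest penguins, Emperor penguins, live in Antarctica.",
    "citations": [
        {"start": 22, "end": 38, "text": "Emperor penguins", "document_ids": ["doc_0"]},
        {"start": 48, "end": 59, "text": "Antarctica.", "document_ids": ["doc_1"]},
    ],
}

for citation in reply["citations"]:
    # Slice the cited span out of the generated text using the offsets.
    span = reply["text"][citation["start"] : citation["end"]]
    print(f"{span!r} is grounded in {citation['document_ids']}")
```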

65 changes: 39 additions & 26 deletions fern/pages/text-generation/retrieval-augmented-generation-rag.mdx
@@ -18,16 +18,27 @@ The code snippet below, for example, will produce a grounded answer to `"Where d

```python PYTHON
import cohere

co = cohere.Client(api_key="<YOUR API KEY>")

co.chat(
-    model="command-r-plus-08-2024",
-    message="Where do the tallest penguins live?",
-    documents=[
-        {"title": "Tall penguins", "snippet": "Emperor penguins are the tallest."},
-        {"title": "Penguin habitats", "snippet": "Emperor penguins only live in Antarctica."},
-        {"title": "What are animals?", "snippet": "Animals are different from plants."}
-    ])
+    model="command-r-plus-08-2024",
+    message="Where do the tallest penguins live?",
+    documents=[
+        {
+            "title": "Tall penguins",
+            "snippet": "Emperor penguins are the tallest.",
+        },
+        {
+            "title": "Penguin habitats",
+            "snippet": "Emperor penguins only live in Antarctica.",
+        },
+        {
+            "title": "What are animals?",
+            "snippet": "Animals are different from plants.",
+        },
+    ],
+)
```

**Response**
@@ -80,12 +91,13 @@ Calling the [Chat API](/reference/chat) with the `search_queries_only` parameter

```python PYTHON
import cohere

co = cohere.Client(api_key="<YOUR API KEY>")

co.chat(
-    model="command-r-08-2024",
-    message="Who is more popular: Nsync or Backstreet Boys?",
-    search_queries_only=True
+    model="command-r-08-2024",
+    message="Who is more popular: Nsync or Backstreet Boys?",
+    search_queries_only=True,
)
```

@@ -110,23 +122,22 @@ If you are looking for greater control over how search queries are generated, yo
Here, we build a tool that takes a user query and returns a list of relevant document snippets for that query. The tool can generate zero, one or multiple search queries depending on the user query.

```python PYTHON
-
query_gen_tool = [
-  {
-    "name": "internet_search",
-    "description": "Returns a list of relevant document snippets for a textual query retrieved from the internet",
-    "parameter_definitions": {
-      "queries": {
-        "description": "a list of queries to search the internet with.",
-        "type": "List[str]",
-        "required": True
-      }
-    }
-  }
+    {
+        "name": "internet_search",
+        "description": "Returns a list of relevant document snippets for a textual query retrieved from the internet",
+        "parameter_definitions": {
+            "queries": {
+                "description": "a list of queries to search the internet with.",
+                "type": "List[str]",
+                "required": True,
+            }
+        },
+    }
]

instructions = "Write a search query that will find helpful information for answering the user's question accurately. If you need more than one search query, write a list of search queries. If you decide that a search is very unlikely to find information that would be useful in constructing a response to the user, you should instead directly answer."

response = co.chat(
    preamble=instructions,
    model="command-r-08-2024",
    # … (remainder of the call collapsed in diff view)
)
```

@@ -152,7 +163,7 @@ You can then customize the preamble and/or the tool definition to generate queri
For example, you can customize the preamble to encourage a longer list of search queries to be generated.

```python PYTHON
-instructions_verbose = "Write many search queries that will find helpful information for answering the user's question accurately. Always write a very long list of at least 7 search queries. If you decide that a search is very unlikely to find information that would be useful in constructing a response to the user, you should instead directly answer.
+instructions_verbose = "Write many search queries that will find helpful information for answering the user's question accurately. Always write a very long list of at least 7 search queries. If you decide that a search is very unlikely to find information that would be useful in constructing a response to the user, you should instead directly answer."
```

```
# Sample response
# … (remainder collapsed in diff view)
```

@@ -226,12 +237,14 @@ As an alternative to manually implementing the 3 step workflow, the Chat API off

```python PYTHON
import cohere

co = cohere.Client(api_key="<YOUR API KEY>")

co.chat(
-    model="command-r-plus-08-2024",
-    message="Who is more popular: Nsync or Backstreet Boys?",
-    connectors=[{"id": "web-search"}])
+    model="command-r-plus-08-2024",
+    message="Who is more popular: Nsync or Backstreet Boys?",
+    connectors=[{"id": "web-search"}],
+)
```

**Response**