diff --git a/fern/pages/models/the-command-family-of-models/command-beta.mdx b/fern/pages/models/the-command-family-of-models/command-beta.mdx
index 91d7dd0a..ab18b4d8 100644
--- a/fern/pages/models/the-command-family-of-models/command-beta.mdx
+++ b/fern/pages/models/the-command-family-of-models/command-beta.mdx
@@ -16,12 +16,12 @@ updatedAt: "Tue Jun 04 2024 18:34:22 GMT+0000 (Coordinated Universal Time)"
-| Latest Model | Description | Context Length | Maximum Output Tokens | Endpoints |
-|---------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------|-----------------------|-------------------------------------------------------------------------------------------|
-| `command`               | An instruction-following conversational model that performs language tasks with high quality, more reliably and with a longer context than our base generative models. | 4k | 4k | [Chat](/reference/chat), [Summarize](/reference/summarize) |
-| `command-light`         | A smaller, faster version of `command`. Almost as capable, but a lot faster. | 4k | 4k | [Chat](/reference/chat), [Summarize](/reference/summarize-2) |
-| `command-nightly`       | To reduce the time between major releases, we put out nightly versions of command models. For `command`, that is `command-nightly`. Be advised that `command-nightly` is the latest, most experimental, and (possibly) unstable version of its default counterpart. Nightly releases are updated regularly, without warning, and are not recommended for production use. | 128K | 4k | [Chat](/reference/chat) |
-| `command-light-nightly` | To reduce the time between major releases, we put out nightly versions of command models. For `command-light`, that is `command-light-nightly`. Be advised that `command-light-nightly` is the latest, most experimental, and (possibly) unstable version of its default counterpart. Nightly releases are updated regularly, without warning, and are not recommended for production use. | 4k | 4k | [Chat](/reference/chat) |
+| Latest Model | Description | Modality | Context Length | Maximum Output Tokens | Endpoints |
+|---------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------|----------------|-----------------------|-------------------------------------------------------------------|
+| `command`               | An instruction-following conversational model that performs language tasks with high quality, more reliably and with a longer context than our base generative models. | Text | 4k | 4k | [Chat](/reference/chat), [Summarize](/reference/summarize) |
+| `command-light`         | A smaller, faster version of `command`. Almost as capable, but a lot faster. | Text | 4k | 4k | [Chat](/reference/chat), [Summarize](/reference/summarize-2) |
+| `command-nightly`       | To reduce the time between major releases, we put out nightly versions of command models. For `command`, that is `command-nightly`. Be advised that `command-nightly` is the latest, most experimental, and (possibly) unstable version of its default counterpart. Nightly releases are updated regularly, without warning, and are not recommended for production use. | Text | 128K | 4k | [Chat](/reference/chat) |
+| `command-light-nightly` | To reduce the time between major releases, we put out nightly versions of command models. For `command-light`, that is `command-light-nightly`. Be advised that `command-light-nightly` is the latest, most experimental, and (possibly) unstable version of its default counterpart. Nightly releases are updated regularly, without warning, and are not recommended for production use. | Text | 4k | 4k | [Chat](/reference/chat) |
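All of the models in this table are selected through the same `model` parameter on the [Chat](/reference/chat) endpoint. The snippet below is a minimal, hypothetical sketch, assuming the Cohere Python SDK's `Client.chat` interface and an API key exported as `COHERE_API_KEY` (both assumptions, not part of the docs above):

```python
# Minimal sketch: switching between `command` and `command-light` on the Chat
# endpoint. Assumes the Cohere Python SDK and a COHERE_API_KEY environment variable.
import os

import cohere

co = cohere.Client(os.environ["COHERE_API_KEY"])

for model in ("command", "command-light"):
    response = co.chat(
        model=model,
        message="Explain the difference between nightly and stable model releases.",
    )
    print(f"--- {model} ---")
    print(response.text)
```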
diff --git a/fern/pages/models/the-command-family-of-models/command-r-plus.mdx b/fern/pages/models/the-command-family-of-models/command-r-plus.mdx
index c9919812..a11e123a 100644
--- a/fern/pages/models/the-command-family-of-models/command-r-plus.mdx
+++ b/fern/pages/models/the-command-family-of-models/command-r-plus.mdx
@@ -17,11 +17,11 @@ Command R+ is Cohere's newest large language model, optimized for conversational
We recommend using Command R+ for those workflows that lean on complex RAG functionality and [multi-step tool use (agents)](/docs/multi-hop-tool-use). Command R, on the other hand, is great for simpler [retrieval augmented generation](/docs/retrieval-augmented-generation-rag) (RAG) and [single-step tool use](/docs/tool-use) tasks, as well as applications where price is a major consideration.
### Model Details
-| Model Name | Description | Context Length | Maximum Output Tokens | Endpoints|
-|--------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------|-----------------------|----------|
-| `command-r-plus-08-2024` | `command-r-plus-08-2024` is an update of the Command R+ model, delivered in August 2024. | 128k | 4k | [Chat](/reference/chat) | |
-| `command-r-plus-04-2024` | Command R+ is an instruction-following conversational model that performs language tasks at a higher quality, more reliably, and with a longer context than previous models. It is best suited for complex RAG workflows and multi-step tool use. | 128k | 4k | [Chat](/reference/chat) | |
-| `command-r-plus` | `command-r-plus` is an alias for `command-r-plus-04-2024`, so if you use `command-r-plus` in the API, that's the model you're pointing to. | 128k | 4k | [Chat](/reference/chat) | |
+| Model Name | Description | Modality | Context Length | Maximum Output Tokens | Endpoints |
+|--------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------|----------------|-----------------------|------------------------|
+| `command-r-plus-08-2024` | `command-r-plus-08-2024` is an update of the Command R+ model, delivered in August 2024. | Text | 128k | 4k | [Chat](/reference/chat)|
+| `command-r-plus-04-2024` | Command R+ is an instruction-following conversational model that performs language tasks at a higher quality, more reliably, and with a longer context than previous models. It is best suited for complex RAG workflows and multi-step tool use. | Text | 128k | 4k | [Chat](/reference/chat)|
+| `command-r-plus` | `command-r-plus` is an alias for `command-r-plus-04-2024`, so if you use `command-r-plus` in the API, that's the model you're pointing to. | Text | 128k | 4k | [Chat](/reference/chat)|
## Command R+ August 2024 Release
Cohere's flagship text-generation models, Command R and Command R+, received a substantial update in August 2024. We chose to designate these models with time stamps, so in the API, Command R+ 08-2024 is accessible with `command-r-plus-08-2024`.
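For reference, a minimal sketch of how the pinned August 2024 snapshot and the `command-r-plus` alias (which resolves to `command-r-plus-04-2024`, per the table above) are addressed through the Chat endpoint. It assumes the Cohere Python SDK's `Client.chat` interface and a `COHERE_API_KEY` environment variable:

```python
# Minimal sketch: pinning a dated snapshot vs. relying on the alias.
# Assumes the Cohere Python SDK and a COHERE_API_KEY environment variable.
import os

import cohere

co = cohere.Client(os.environ["COHERE_API_KEY"])

question = "What is retrieval augmented generation?"

# Pinned to the August 2024 release.
pinned = co.chat(model="command-r-plus-08-2024", message=question)

# Alias: currently points at command-r-plus-04-2024.
aliased = co.chat(model="command-r-plus", message=question)

print(pinned.text)
print(aliased.text)
```

Pinning a dated snapshot keeps behavior stable across future alias updates, at the cost of not picking up improvements automatically.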
diff --git a/fern/pages/models/the-command-family-of-models/command-r.mdx b/fern/pages/models/the-command-family-of-models/command-r.mdx
index ed4ef2d4..bc5f6185 100644
--- a/fern/pages/models/the-command-family-of-models/command-r.mdx
+++ b/fern/pages/models/the-command-family-of-models/command-r.mdx
@@ -17,11 +17,11 @@ Command R is a large language model optimized for conversational interaction and
Command R boasts high precision on [retrieval augmented generation](/docs/retrieval-augmented-generation-rag) (RAG) and tool use tasks, low latency and high throughput, a long 128,000-token context length, and strong capabilities across 10 key languages.
### Model Details
-| Model Name | Description | Context Length | Maximum Output Tokens | Endpoints|
-|--------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------|-----------------------|----------|
-| `command-r-08-2024` | `command-r-08-2024` is an update of the Command R model, delivered in August 2024. | 128k | 4k | [Chat](/reference/chat) | |
-| `command-r-03-2024` | Command R is an instruction-following conversational model that performs language tasks at a higher quality, more reliably, and with a longer context than previous models. It can be used for complex workflows like code generation, retrieval augmented generation (RAG), tool use, and agents. | 128k | 4k | [Chat](/reference/chat) | |
-| `command-r` | `command-r` is an alias for `command-r-03-2024`, so if you use `command-r` in the API, that's the model you're pointing to. | 128k | 4k | [Chat](/reference/chat) | |
+| Model Name | Description | Modality | Context Length | Maximum Output Tokens | Endpoints |
+|--------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------|----------------|-----------------------|-------------------------|
+| `command-r-08-2024`      | `command-r-08-2024` is an update of the Command R model, delivered in August 2024. | Text | 128k | 4k | [Chat](/reference/chat) |
+| `command-r-03-2024`      | Command R is an instruction-following conversational model that performs language tasks at a higher quality, more reliably, and with a longer context than previous models. It can be used for complex workflows like code generation, retrieval augmented generation (RAG), tool use, and agents. | Text | 128k | 4k | [Chat](/reference/chat) |
+| `command-r`              | `command-r` is an alias for `command-r-03-2024`, so if you use `command-r` in the API, that's the model you're pointing to. | Text | 128k | 4k | [Chat](/reference/chat) |
## Command R August 2024 Release
Cohere's flagship text-generation models, Command R and Command R+, received a substantial update in August 2024. We chose to designate these models with time stamps, so in the API, Command R 08-2024 is accessible with `command-r-08-2024`.
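Likewise, a minimal streaming sketch for the August 2024 Command R snapshot. This assumes the Cohere Python SDK's `chat_stream` interface and its `text-generation` event type; treat the event names and attributes here as assumptions rather than a definitive reference:

```python
# Minimal sketch: streaming tokens from command-r-08-2024 via the Chat endpoint.
# Assumes the Cohere Python SDK and a COHERE_API_KEY environment variable.
import os

import cohere

co = cohere.Client(os.environ["COHERE_API_KEY"])

stream = co.chat_stream(
    model="command-r-08-2024",
    message="List three use cases for retrieval augmented generation.",
)

for event in stream:
    # Text deltas are assumed to arrive as "text-generation" events; other event
    # types carry metadata such as the final, aggregated response.
    if event.event_type == "text-generation":
        print(event.text, end="", flush=True)
print()
```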