
Reshuffle sections
Cohee1207 committed Sep 15, 2024
1 parent 0ec09b9 commit 85a0482
Showing 1 changed file with 17 additions and 15 deletions.
32 changes: 17 additions & 15 deletions readme.md
@@ -14,6 +14,17 @@ SillyTavern is a passion project brought to you by a dedicated community of LLM
| ![Extensions](/static/screenshot5.jpg) | ![Response Config](/static/screenshot6.jpg) |
| ![Backgrounds](/static/screenshot7.jpg) | ![User Settings](/static/screenshot8.jpg) |

## Installation Requirements

The hardware requirements are minimal: it will run on anything that can run NodeJS 18 or higher. If you intend to do LLM inference on your local machine, we recommend a 3000-series NVIDIA graphics card with at least 6GB of VRAM.

Follow the installation guide for your platform:

* [Windows](https://docs.sillytavern.app/installation/windows/)
* [Linux and Mac](https://docs.sillytavern.app/installation/linuxmacos/)
* [Android](https://docs.sillytavern.app/installation/android-(termux)/)
* [Docker](https://docs.sillytavern.app/installation/docker/)

## Branches

SillyTavern is being developed using a two-branch system to ensure a smooth experience for all users.
@@ -23,6 +34,10 @@ SillyTavern is being developed using a two-branch system to ensure a smooth expe

Learn more [here](https://docs.sillytavern.app/usage/branches/).

## What do I need other than SillyTavern?

Since SillyTavern is only an interface, you will need access to an LLM backend to provide inference. You can use AI Horde for instant out-of-the-box chatting. Aside from that, we support many other local and cloud-based LLM backends: OpenAI-compatible API, KoboldAI, Tabby, and many more. You can read more about our supported APIs in the [API Connections](https://docs.sillytavern.app/usage/api-connections/) section.
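
For a sense of what "OpenAI-compatible" means here, the sketch below shows a raw request to such a backend's chat completions route. The base URL, model name, and key are placeholders rather than SillyTavern defaults, and SillyTavern makes this kind of call for you through its connection settings; the snippet only illustrates the protocol.

```ts
// Minimal sketch of talking to an OpenAI-compatible backend directly.
// BASE_URL, MODEL, and API_KEY are placeholders, not SillyTavern defaults.
const BASE_URL = "http://localhost:5000/v1";
const MODEL = "local-model";
const API_KEY = "not-needed-for-most-local-backends";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Hello there!").then(console.log);
```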

## Character Cards

SillyTavern is built around the concept of "character cards". A character card is a collection of prompts that set the behavior of the LLM and is required to have persistent conversations in SillyTavern. They function similarly to ChatGPT's GPTs or Poe's bots. The content of a character card can be anything: an abstract scenario, an assistant tailored for a specific task, a famous personality or a fictional character.
@@ -33,6 +48,8 @@ The name field is the only required character card input. To start a neutral con

To get a general idea of how to define character cards, see the default character (Seraphina) or download selected community-made cards from the "Download Extensions & Assets" menu.

You can also create your own character cards from scratch. Refer to the [Character Design](/Usage/Core%20Concepts/characterdesign.md) guide for more information.
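
As a rough sketch of the idea, a character card can be pictured as a small structured object. The field names below follow the community card format commonly used with SillyTavern, but this is illustrative, not a complete or authoritative schema, and the values are made up rather than taken from the bundled Seraphina card.

```ts
// Illustrative only: a character card as a bundle of prompt-defining fields.
// "name" is the only required field; the rest shape the LLM's behavior.
interface CharacterCard {
  name: string;           // required
  description?: string;   // who or what the character is
  personality?: string;   // summary of traits
  scenario?: string;      // situation the chat starts in
  first_mes?: string;     // the character's opening message
  mes_example?: string;   // example dialogue to steer writing style
}

// Hypothetical values for illustration.
const card: CharacterCard = {
  name: "Seraphina",
  description: "A gentle guardian spirit of a forest glade.",
  first_mes: "A soft voice greets you as you wake beneath the old oak.",
};
```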

## Key Features

* Advanced [text generation settings](/Usage/Core%20Concepts/advancedformatting.md) with many community-made presets
@@ -54,21 +71,6 @@ SillyTavern has extensibility support.
* [Web Search capabilities for adding additional real world context to your prompts](/extensions/WebSearch.md)
* Many more are available to download from the "Download Extensions & Assets" menu.

## Installation Requirements

The hardware requirements are minimal: it will run on anything that can run NodeJS 18 or higher. If you intend to do LLM inference on your local machine, we recommend a 3000-series NVIDIA graphics card with at least 6GB of VRAM.

Follow the installation guide for your platform:

* [Windows](https://docs.sillytavern.app/installation/windows/)
* [Linux and Mac](https://docs.sillytavern.app/installation/linuxmacos/)
* [Android](https://docs.sillytavern.app/installation/android-(termux)/)
* [Docker](https://docs.sillytavern.app/installation/docker/)

## What do I need other than SillyTavern?

Since SillyTavern is only an interface, you will need access to an LLM backend to provide inference. You can use AI Horde for instant out-of-the-box chatting. Aside from that, we support many other local and cloud-based LLM backends: OpenAI-compatible API, KoboldAI, Tabby, and many more. You can read more about our supported APIs in the [API Connections](https://docs.sillytavern.app/usage/api-connections/) section.

## How can I get in touch with the developers directly?

* Discord: cohee, rossascends, wolfsblvt