From 85a04820bdb0581d387cf6b135d4e193718772b6 Mon Sep 17 00:00:00 2001
From: Cohee <18619528+Cohee1207@users.noreply.github.com>
Date: Sun, 15 Sep 2024 20:33:41 +0300
Subject: [PATCH] Reshuffle sections

---
 readme.md | 32 +++++++++++++++++---------------
 1 file changed, 17 insertions(+), 15 deletions(-)

diff --git a/readme.md b/readme.md
index 57674b7e4..ea89bb1ca 100644
--- a/readme.md
+++ b/readme.md
@@ -14,6 +14,17 @@ SillyTavern is a passion project brought to you by a dedicated community of LLM
 | ![Extensions](/static/screenshot5.jpg) | ![Response Config](/static/screenshot6.jpg) |
 | ![Backgrounds](/static/screenshot7.jpg) | ![User Settings](/static/screenshot8.jpg) |
 
+## Installation Requirements
+
+The hardware requirements are minimal: it will run on anything that can run NodeJS 18 or higher. If you intend to do LLM inference on your local machine, we recommend a 3000-series NVIDIA graphics card with at least 6GB of VRAM.
+
+Follow the installation guide for your platform:
+
+* [Windows](https://docs.sillytavern.app/installation/windows/)
+* [Linux and Mac](https://docs.sillytavern.app/installation/linuxmacos/)
+* [Android](https://docs.sillytavern.app/installation/android-(termux)/)
+* [Docker](https://docs.sillytavern.app/installation/docker/)
+
 ## Branches
 
 SillyTavern is being developed using a two-branch system to ensure a smooth experience for all users.
@@ -23,6 +34,10 @@ SillyTavern is being developed using a two-branch system to ensure a smooth expe
 
 Learn more [here](https://docs.sillytavern.app/usage/branches/).
 
+## What do I need other than SillyTavern?
+
+Since SillyTavern is only an interface, you will need access to an LLM backend to provide inference. You can use AI Horde for instant out-of-the-box chatting. Aside from that, we support many other local and cloud-based LLM backends: OpenAI-compatible API, KoboldAI, Tabby, and many more. You can read more about our supported APIs in the [API Connections](https://docs.sillytavern.app/usage/api-connections/) section.
+
 ## Character Cards
 
 SillyTavern is built around the concept of "character cards". A character card is a collection of prompts that set the behavior of the LLM and is required to have persistent conversations in SillyTavern. They function similarly to ChatGPT's GPTs or Poe's bots. The content of a character card can be anything: an abstract scenario, an assistant tailored for a specific task, a famous personality or a fictional character.
@@ -33,6 +48,8 @@ The name field is the only required character card input. To start a neutral con
 
 To get a general idea on how to define character cards, see the default character (Seraphina) or download selected community-made cards from the "Download Extensions & Assets" menu.
 
+You can also create your own character cards from scratch. Refer to the [Character Design](/Usage/Core%20Concepts/characterdesign.md) guide for more information.
+
 ## Key Features
 
 * Advanced [text generation settings](/Usage/Core%20Concepts/advancedformatting.md) with many community-made presets
@@ -54,21 +71,6 @@ SillyTavern has extensibility support.
 * [Web Search capabilities for adding additional real world context to your prompts](/extensions/WebSearch.md)
 * Many more are available to download from the "Download Extensions & Assets" menu.
 
-## Installation Requirements
-
-The hardware requirements are minimal: it will run on anything that can run NodeJS 18 or higher. If you intend to do LLM inference on your local machine, we recommend a 3000-series NVIDIA graphics card with at least 6GB of VRAM.
-
-Follow the installation guide for your platform:
-
-* [Windows](https://docs.sillytavern.app/installation/windows/)
-* [Linux and Mac](https://docs.sillytavern.app/installation/linuxmacos/)
-* [Android](https://docs.sillytavern.app/installation/android-(termux)/)
-* [Docker](https://docs.sillytavern.app/installation/docker/)
-
-## What do I need other than SillyTavern?
-
-Since SillyTavern is only an interface, you will need access to an LLM backend to provide inference. You can use AI Horde for instant out-of-the-box chatting. Aside from that, we support many other local and cloud-based LLM backends: OpenAI-compatible API, KoboldAI, Tabby, and many more. You can read more about our supported APIs in the [API Connections](https://docs.sillytavern.app/usage/api-connections/) section.
-
 ## How can I get in touch with the developers directly?
 
 * Discord: cohee, rossascends, wolfsblvt