Update Intro and Installation, added Quickstart (#4)
robbenti authored Jun 19, 2024
1 parent a9b015d commit 17901de
Showing 20 changed files with 141 additions and 65 deletions.
8 changes: 0 additions & 8 deletions docs/docs/community/_category_.json

This file was deleted.

Empty file removed docs/docs/community/community.md
Empty file.
8 changes: 0 additions & 8 deletions docs/docs/contributing/_category_.json

This file was deleted.

Empty file.
8 changes: 0 additions & 8 deletions docs/docs/getting-started/_category_.json

This file was deleted.

8 changes: 0 additions & 8 deletions docs/docs/quickstart-tour/_category_.json

This file was deleted.

Empty file.
43 changes: 16 additions & 27 deletions docs/docs/user-guide/installation.md
@@ -3,7 +3,7 @@ sidebar_position: 1
---

# Installation
The platform is composed by different modules
The platform is composed of different modules
* **UI:** the front-end application
* **API:** the back-end application
* **Processing:** the Spark jobs
@@ -35,39 +35,23 @@ docker compose up
If you want to access the platform's user interface (UI):

```bash
docker compose --profile ui up --force-recreate
docker compose --profile ui up
```

After all containers are up and running, you can access the platform at [http://localhost:5173](http://localhost:5173) to start using it.

*Notes: The `--force-recreate` flag forces Docker Compose to restart all containers, even if running. This is useful when making configuration or image changes and wanting a fresh start. [More info](https://docs.docker.com/reference/cli/docker/compose/up/)*

#### Start from a clean workspace
To ensure a clean environment for running the platform, it's recommended to remove any existing named volumes and container images related to previous runs. You can find detailed information about this process in the [Docker Compose documentation](https://docs.docker.com/reference/cli/docker/compose/down/)

```bash
docker compose down -v --rmi all
```

The `-v` flag is optional but recommended in this case. It removes any named volumes associated with the platform, such as those used for storing data from services like Postgres or Kubernetes. The `--rmi all` flag also removes all images that were defined in the Docker Compose file. By default, `docker-compose down` only removes running containers, so these flags ensure a clean state for starting the platform.
#### Accessing the Kubernetes Cluster
The platform creates a Kubernetes cluster for managing deployments. You can connect and interact with this cluster from your local machine using tools like Lens or `kubectl`.

If you want to delete just volume data run:
The compose file includes a [k9s](https://k9scli.io/) container that can be used to monitor the k3s cluster.

```bash
docker compose down -v
docker compose up k9s -d && docker attach radicalbit-ai-monitoring-k9s-1
```

#### Accessing the Kubernetes Cluster
The platform creates a Kubernetes cluster for managing deployments. You can connect and interact with this cluster from your local machine using tools like Lens or `kubectl`.

##### Using the kubeconfig File
A file named `kubeconfig.yaml` is automatically generated within the directory `./docker/k3s_data/kubeconfig/` when the platform starts. This file contains sensitive information used to authenticate with the Kubernetes cluster.

##### Security Considerations (Important!)
*Do not modify the original `kubeconfig.yaml` file.* Modifying the server address within the original file can potentially expose the cluster to unauthorized access from outside your local machine.

*Instead, create a copy of the `kubeconfig.yaml` file and modify the copy for local use.* This ensures the original file with the default server address remains secure.

##### Here's how to connect to the cluster:
1. Copy the `kubeconfig.yaml` file to a desired location on your local machine.
1. Edit the copied file and replace the server address `https://k3s:6443` with `https://127.0.0.1:6443`. This points the kubeconfig file to the local Kubernetes cluster running on your machine.
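
As a minimal sketch of those two steps from the command line (assuming GNU `sed` — macOS users may need `sed -i ''` — and `kubectl` installed; the copy's filename is only an example):

```bash
# Copy the generated kubeconfig and point the copy at the local cluster
cp ./docker/k3s_data/kubeconfig/kubeconfig.yaml ./kubeconfig.local.yaml
sed -i 's|https://k3s:6443|https://127.0.0.1:6443|' ./kubeconfig.local.yaml

# Use the modified copy with kubectl (or import it into Lens)
kubectl --kubeconfig ./kubeconfig.local.yaml get nodes
```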
@@ -77,9 +61,14 @@
In order to use real AWS S3 instead of MinIO, it is necessary to modify the environment variables of the api container, setting real values for `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_REGION` and `S3_BUCKET_NAME` and removing `S3_ENDPOINT_URL`.
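
As a hedged illustration only (the exact mechanism depends on how the compose file defines the api service's environment, which is not shown here), the variables involved would look something like this, with placeholder values:

```bash
# Hypothetical example: point the api container at real AWS S3 instead of MinIO.
# Replace the placeholder values with real credentials; how these variables are
# injected into the api service (editing the compose file's environment section,
# an env_file, or shell interpolation) depends on your setup.
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_REGION="eu-west-1"
export S3_BUCKET_NAME="your-bucket-name"
# S3_ENDPOINT_URL must not be set when using real AWS S3 (it is only needed for MinIO)
unset S3_ENDPOINT_URL
```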

#### Teardown
To clean the environment or if something happens and a clean start is needed:
To completely clean up the environment, we can use [docker compose down](https://docs.docker.com/reference/cli/docker/compose/down/):

* Stop the docker compose
* Remove all containers
* Remove the volume
* Delete the `./docker/k3s_data/kubeconfig` folder
```bash
docker compose --profile ui --profile k9s down -v --remove-orphans
```

To remove everything including container images:

```bash
docker compose --profile ui --profile k9s down -v --remove-orphans --rmi all
```
119 changes: 119 additions & 0 deletions docs/docs/user-guide/quickstart.md
@@ -0,0 +1,119 @@
---
sidebar_position: 2
---

# Quickstart
This guide provides instructions on how to monitor an AI solution through the Radicalbit AI Platform.

## Monitor an LLM for a Binary Classification
The use case presented here involves an LLM (powered with RAG) that generates answers to users' questions in a chatbot for banking services.

### Introduction

The model returns two different outputs:

1. `model_answer`: the answer generated by retrieving similar information
1. `prediction`: a boolean value which indicates if the user's question is pertinent to banking topics.

This second output is useful because, by classifying the questions into categories, the bank can later fine-tune the model using only the data related to banking services and improve its performance.

### Model Creation
To use the radicalbit-ai-monitoring platform, you first need to prepare your data, which should include the following information:

1. **Features:** the list of variables used by the model to produce the inference. They may also include metadata (timestamp, log)
2. **Outputs:** the fields returned by the model after the inference. Usually, they are probabilities, a predicted class or number in the case of classic ML, or generated text in the case of LLMs
3. **Target:** the ground truth used to validate predictions and evaluate the model quality

This tutorial covers batch monitoring, i.e. the situation where you have historical data that you want to compare over time.

The **reference dataset** is the name we use for the batch that contains the information we expect (or desire) to see consistently over time. It could be the training set or a chunk of production data where the model performed well.

The **current dataset** is the name we use for the batch that contains fresh information, for example the most recent production data, predictions or ground truths. We expect it to have the same characteristics (statistical properties) as the reference, which indicates that the model performs as expected and there is no drift in the data.

What follows is an example of the data we will use in this tutorial:

| timestamp | user_id | question | model_answer | ground_truth | prediction | gender | age | device | days_as_customer |
|-----------|--------:|:---------|:-------------|-------------:|-----------:|:-------|----:|:-------|-----------------:|
|2024-01-11 08:08:00|user_24|What documents do I need to open a business account?|You need a valid ID, proof of address, and business registration documents.|1|1|M|44|smartphone|194|
|2024-01-10 03:08:00|user_27|What are the benefits of a premium account?|The benefits of a premium account include higher interest rates and exclusive customer support.|1|1|F|29|tablet|258|
|2024-01-11 12:22:00|user_56|How can I check my credit score?|You can check your credit score for free through our mobile app.|1|1|F|44|smartphone|51|
|2024-01-10 04:57:00|user_58|Are there any fees for using ATMs?|ATM usage is free of charge at all locations.|1|1|M|50|smartphone|197|

* **timestamp:** the time at which the user asked the question
* **user_id:** the user identifier
* **question:** the question the user asked the chatbot
* **model_answer:** the answer generated by the model
* **ground_truth:** the real label, where 1 stands for an answer related to banking services and 0 stands for a different topic
* **prediction:** the judgment produced by the model about the topic of the answer
* **gender:** the user's gender
* **age:** the user's age
* **device:** the device used in the current session
* **days_as_customer:** how many days the user has been a customer

### Create the Model
To create a new model, navigate to the *Models* section and click the plus (+) icon.

![Alt text](/img/quickstart/empty-models-list.png "Empty Models List")

The platform should open a modal to allow users to create a new model.

![Alt text](/img/quickstart/new-model-modal-s1.png "New Model")

This modal prompts you to enter the following details:
* **Name:** the name of the model
* **Model type:** the type of the model; in the current platform version only `Binary Classification` is available
* **Data type:** the data type used by the model
* **Granularity:** the window used to calculate aggregated metrics
* **Framework:** an optional field to describe the frameworks used by the model
* **Algorithm:** an optional field to explain the algorithm used by the model

Please enter the following details and click on the *Next* button:
* **Name:** `LLM-binary-classification`
* **Model type:** `Binary Classification`
* **Data type:** `Tabular`
* **Granularity:** `Hour`

To infer the model schema, you have to upload a sample dataset. Please download and use [this reference Comma-Separated Values file](https://github.com/radicalbit/radicalbit-ai-monitoring/blob/9f21c19e97a9dfa51c1bf17002fcdd76d5a5f304/examples/data/df_reference.csv) and click on the *Next* button.
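
If you prefer to grab the file from the command line, here is a quick sketch, assuming `curl` is available and that the raw URL below (derived from the repository link above) is still valid:

```bash
# Download the example reference dataset and preview its header plus the first rows
curl -L -o df_reference.csv \
  "https://raw.githubusercontent.com/radicalbit/radicalbit-ai-monitoring/9f21c19e97a9dfa51c1bf17002fcdd76d5a5f304/examples/data/df_reference.csv"
head -n 3 df_reference.csv
```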

![Alt text](/img/quickstart/new-model-modal-s2.png "Upload CSV file")

Once you've defined the model schema, select the output fields from the variables. Choose `model_answer` and `prediction`, move them to the right, and click on the *Next* button.

![Alt text](/img/quickstart/new-model-modal-s3.png "Output fields selection")

Finally, you need to select and associate the following fields:
* **Target:** the target field or ground truth
* **Timestamp:** the field containing the timestamp value
* **Prediction:** the actual prediction
* **Probability:** the probability score associated with the prediction

Match the following values to their corresponding fields:
* **Target:** `ground_truth`
* **Timestamp:** `timestamp`
* **Prediction:** `prediction`
* **Probability:** leave empty

![Alt text](/img/quickstart/new-model-modal-s4.png "Identify ground truth (target), timestamp, prediction, and probability fields")

Click the *Save Model* button to finalize model creation.

### Model details
Entering the model details, we can see three main sections:

* **Overview:** this section provides information about the dataset and its schema. You can view a summary, explore the variables (features and ground truth) and the output fields for your model.
* **Reference:** the Reference section displays performance metrics calculated on the imported reference data.
* **Current:** the Current section displays metrics for any user-uploaded data sets you've added in addition to the reference dataset.

#### Import Reference Dataset
To calculate metrics for your reference dataset, import a CSV file.

![Alt text](/img/quickstart/import-reference.png "Import Reference")

Once you initiate the process, the platform will run background jobs to calculate the metrics.

#### Import Current Dataset
To calculate metrics for your current dataset, import a CSV file.

![Alt text](/img/quickstart/import-current.png "Import Current")

Once you initiate the process, the platform will run background jobs to calculate the metrics.
6 changes: 3 additions & 3 deletions docs/docusaurus.config.js
@@ -13,7 +13,7 @@ const VERSION = '0.8.0';
/** @type {import('@docusaurus/types').Config} */
const config = {
title: 'Radicalbit AI Monitoring',
tagline: 'Dinosaurs are cool',
tagline: 'Simpler, Faster, Better MLOps',
favicon: 'img/favicon.ico',

// Set the production url of your site here
@@ -25,7 +25,7 @@ const config = {
// GitHub pages deployment config.
// If you aren't using GitHub pages, you don't need these.
organizationName: 'Radicalbit', // Usually your GitHub org/user name.
projectName: 'github-pages-test', // Usually your repo name.
projectName: 'radicalbit-ai-monitoring', // Usually your repo name.
deploymentBranch: 'gh-pages',

onBrokenLinks: 'throw',
@@ -91,7 +91,7 @@ const config = {
dropdownActiveClassDisabled: true,
},
{
href: 'https://github.com/robbenti/github-pages-test',
href: 'https://github.com/radicalbit/radicalbit-ai-monitoring',
label: 'GitHub',
position: 'right',
},
4 changes: 2 additions & 2 deletions docs/package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion docs/package.json
@@ -1,5 +1,5 @@
{
"name": "github-pages-test",
"name": "radicalbit-ai-monitoring",
"version": "0.0.0",
"private": true,
"scripts": {
Binary file added docs/static/img/favicon.ico
Binary file not shown.
Binary file added docs/static/img/quickstart/empty-models-list.png
Binary file added docs/static/img/quickstart/import-current.png
Binary file added docs/static/img/quickstart/import-reference.png
