
Commit

Merge branch 'main' of github.com:radicalbit/radicalbit-ai-monitoring into feature/ROS-265-add-latest-reference-and-current-uuids-to-modelout
dtria91 committed Jun 25, 2024
2 parents 24fa036 + c6a8f14 commit f5ab184
Showing 5 changed files with 98 additions and 5 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/radicalbit-bot.yaml
@@ -3,7 +3,7 @@ on:
issue_comment:
types: [created, edited, deleted]
env:
AVAILABLE_COMMANDS: '["/build-api","/build-spark","/build-ui","/build-migrations","build-all"]'
AVAILABLE_COMMANDS: '["/build-api","/build-spark","/build-ui","/build-migrations","/build-all"]'

jobs:
build-api:
60 changes: 56 additions & 4 deletions CONTRIBUTING.md
@@ -1,12 +1,64 @@
# Contribute to this project
In order to contribute to this project please follow all guidelines here reported

# Commit messages conventions
This project follows the [conventional commits](https://www.conventionalcommits.org/it/v1.0.0/) specification.
🎉 Thank you for considering contributing to **Radicalbit AI Monitoring**! 🎉

We welcome contributions from developers like you. To ensure a smooth contribution process, please follow the guidelines below.

## Commit messages conventions
This project follows the [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/) specification.
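
For illustration, hypothetical commit messages that follow the specification might look like this (the scopes and descriptions below are made up):

```text
feat(sdk): add optional endpoint_url to AwsCredentials
fix(ui): align pagination on the model list
docs: clarify local development setup
```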

Steps to follow:

1. Install [pre-commit](https://pre-commit.com/) because we'll use it as our main linter. It's completely plugin-based; you can follow [these instructions](https://pre-commit.com/#installation) to install it on your machine.

2. Install and configure the [conventional-pre-commit](https://github.com/compilerla/conventional-pre-commit) plugin, which lets the linter enforce the use of conventional commits.
This repo already ships with a proper configuration of the plugin; see the setup instructions [here](https://github.com/compilerla/conventional-pre-commit?tab=readme-ov-file#usage).

## Development Environment Setup

In this section we show how to set up the development environment for the project.

For more details on developing the UI and the API, see the [ui](./ui/README.md) and [api](./api/README.md) README files.

### Prerequisites

Make sure you have Docker and Docker Compose installed on your machine.

We use [docker compose watch](https://docs.docker.com/compose/file-watch/) for the development environment, so make sure Docker Compose `2.22.0` or later is installed.

### Start the Environment

Run `docker compose --profile ui up --watch` to start the app in DEV mode.

Running the above command starts the following containers:

1. **ui**: nginx container with the UI built using Yarn (see [Dockerfile](./ui/Dockerfile))
1. **api**: FastAPI application server (see [Dockerfile](./api/Dockerfile))
1. **migrations**: Alembic container to manage database migrations (see [Dockerfile](./api/migrations.Dockerfile)). For more info on how to create new migrations, please refer to the [api README.md file](./api/README.md#generate-a-new-migration)
1. **k3s**: K3s cluster where Spark jobs are executed
1. **postgres**: PostgreSQL database
1. **minio**: S3-compatible object storage
1. **adminer**: to interact with the database if needed
1. **createbucket**: container to create the default bucket in MinIO

Once all the containers are up and running, you can make changes to the api and/or ui folders and Docker Compose will restart the modified containers.

## Code Style

We use [Ruff](https://docs.astral.sh/ruff/) as our main Python linter and formatter, and [ESLint](https://eslint.org/) for *JavaScript* code.

## Issues and Bugs

If you find any issues or bugs, please open a GitHub issue with a detailed description of the problem and steps to reproduce it.

## Pull Requests

1. Fork the repository and create a new branch for your feature or bug fix
1. Make your changes and ensure all tests pass
1. Open a pull request with a clear title and description
1. Be ready to address any feedback or comments during the code review process

*Notes on Pull Requests:*

- We [check](./.github/workflows/semantic-pr.yaml) that all pull request titles follow the conventional commit format.
- All Docker image builds on pull request events are disabled by default. If needed, you can trigger a build by commenting on the pull request with one of the available commands: `/build-api`, `/build-spark`, `/build-ui`, `/build-migrations`, `/build-all`.
2 changes: 2 additions & 0 deletions README.md
@@ -9,6 +9,8 @@
![GitHub Release](https://img.shields.io/github/v/release/radicalbit/radicalbit-ai-monitoring)
![GitHub License](https://img.shields.io/github/license/radicalbit/radicalbit-ai-monitoring)
![Discord](https://img.shields.io/discord/1252978922962817034)
[![Security Scan](https://img.shields.io/github/actions/workflow/status/radicalbit/radicalbit-ai-monitoring/trivy-scan.yaml?branch=main&label=Security%20Scan)](./.github/workflows/trivy-scan.yaml)

# Radicalbit AI Monitoring

36 changes: 36 additions & 0 deletions sdk/radicalbit_platform_sdk/apis/model.py
@@ -181,6 +181,15 @@ def load_reference_dataset(
if aws_credentials is None
else aws_credentials.default_region
),
endpoint_url=(
    None if aws_credentials is None else aws_credentials.endpoint_url
),
)

s3_client.upload_file(
@@ -237,6 +246,15 @@ def bind_reference_dataset(
region_name=(
None if aws_credentials is None else aws_credentials.default_region
),
endpoint_url=(
    None if aws_credentials is None else aws_credentials.endpoint_url
),
)

chunks_iterator = s3_client.get_object(
@@ -314,6 +332,15 @@ def load_current_dataset(
if aws_credentials is None
else aws_credentials.default_region
),
endpoint_url=(
    None if aws_credentials is None else aws_credentials.endpoint_url
),
)

s3_client.upload_file(
@@ -372,6 +399,15 @@ def bind_current_dataset(
region_name=(
None if aws_credentials is None else aws_credentials.default_region
),
endpoint_url=(
    None if aws_credentials is None else aws_credentials.endpoint_url
),
)

chunks_iterator = s3_client.get_object(
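
Each of the four call sites above forwards the credentials' optional `endpoint_url` to the boto3 S3 client: when the value is `None`, boto3 falls back to its default AWS endpoint, while a non-`None` value (for example a local MinIO URL) redirects all S3 traffic there. Below is a minimal sketch of that pattern; `build_s3_client` is a hypothetical helper, and the kwargs other than `endpoint_url` are assumed from the `AwsCredentials` model rather than copied from the diff.

```python
# Minimal sketch of the client-construction pattern shown in the diff above.
# build_s3_client is a hypothetical helper, not a function defined in the SDK.
from typing import Optional

import boto3

from radicalbit_platform_sdk.models.aws_credentials import AwsCredentials


def build_s3_client(aws_credentials: Optional[AwsCredentials]):
    # boto3 treats endpoint_url=None as "use the default AWS S3 endpoint",
    # so forwarding None when no credentials are provided is harmless.
    return boto3.client(
        "s3",
        aws_access_key_id=(
            None if aws_credentials is None else aws_credentials.access_key_id
        ),
        aws_secret_access_key=(
            None if aws_credentials is None else aws_credentials.secret_access_key
        ),
        region_name=(
            None if aws_credentials is None else aws_credentials.default_region
        ),
        endpoint_url=(
            None if aws_credentials is None else aws_credentials.endpoint_url
        ),
    )
```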
3 changes: 3 additions & 0 deletions sdk/radicalbit_platform_sdk/models/aws_credentials.py
@@ -1,7 +1,10 @@
from typing import Optional

from pydantic import BaseModel


class AwsCredentials(BaseModel):
    access_key_id: str
    secret_access_key: str
    default_region: str
    endpoint_url: Optional[str]
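
With the new optional `endpoint_url`, SDK users can point the platform at any S3-compatible store, for example the MinIO container started by the development environment described in CONTRIBUTING.md. A hypothetical example follows; the credential values and the MinIO URL are placeholders, not project defaults.

```python
# Hypothetical usage of the extended model; all values below are placeholders.
from radicalbit_platform_sdk.models.aws_credentials import AwsCredentials

# Target a local, S3-compatible MinIO instance.
local_minio = AwsCredentials(
    access_key_id="minio-access-key",
    secret_access_key="minio-secret-key",
    default_region="us-east-1",
    endpoint_url="http://localhost:9000",
)

# Target AWS S3 itself: pass endpoint_url=None so boto3 uses the default endpoint.
aws_s3 = AwsCredentials(
    access_key_id="my-access-key-id",
    secret_access_key="my-secret-access-key",
    default_region="eu-west-1",
    endpoint_url=None,
)
```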
