
HarpIA Model Gateway

This repository contains a Python implementation of an answer provider backend server for the HarpIA Ajax Moodle plugin.

Current provider classes:

  • ConstantAnswerProvider: always generates the same answer, regardless of the input.
  • EchoAnswerProvider: echoes the input back, optionally converting it to uppercase.
  • GPTAnswerProvider: uses the OpenAI API to obtain answers from GPT models.
  • OllamaAnswerProvider: uses the Ollama API to obtain answers from locally hosted models.
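
The providers above all share the same job: turn an incoming message into an answer string. As a purely illustrative sketch (the class hierarchy, method names, and constructor parameters below are assumptions, not this repository's actual code), the two trivial providers might look like:

```python
# Hypothetical sketch of the provider interface; the real class and
# method names in this repository may differ.
from abc import ABC, abstractmethod


class AnswerProvider(ABC):
    """Base class: turns a user message into an answer."""

    @abstractmethod
    def answer(self, message: str) -> str: ...


class ConstantAnswerProvider(AnswerProvider):
    """Always returns the same fixed answer, regardless of input."""

    def __init__(self, constant: str):
        self.constant = constant

    def answer(self, message: str) -> str:
        return self.constant


class EchoAnswerProvider(AnswerProvider):
    """Echoes the input back, optionally uppercased."""

    def __init__(self, uppercase: bool = False):
        self.uppercase = uppercase

    def answer(self, message: str) -> str:
        return message.upper() if self.uppercase else message
```

The GPT and Ollama providers would implement the same interface but delegate to their respective remote APIs.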

Requirements:

  • Python ≥ 3.11;
  • Docker (recommended).

Usage instructions:

  • Create a configuration file: copy config_TEMPLATE.py and edit the copy to choose the models to be served, following the instructions in the file.
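
    The authoritative description of the configuration format is in config_TEMPLATE.py itself. As a purely illustrative sketch (the keys and structure below are assumptions), a configuration might map model names, such as the ECHO name used in the commands below, to provider settings:

    ```python
    # Illustrative only; the real keys and structure are documented
    # inside config_TEMPLATE.py.
    MODELS = {
        "ECHO": {"provider": "EchoAnswerProvider", "uppercase": False},
        "GPT": {
            "provider": "GPTAnswerProvider",
            "model": "gpt-4o",       # hypothetical model name
            "api_key": "sk-...",     # placeholder, never commit real keys
        },
    }
    ```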

  • Build the Docker image:

    docker build -t harpia-model-gateway:1.0 -f containers/prod/Dockerfile .
  • Test an answer provider by interacting with it in a terminal (replace ./config/config1.py with the path to your configuration file, and replace ECHO with the name of the desired model as specified in the configuration):

    docker run --rm -it --name harpia-gateway -v './config/config1.py':/cfg.py harpia-model-gateway:1.0 --config=/cfg.py cli --provider='ECHO'
  • Start the server (replace ./config/config1.py with the path to your configuration file, optionally replace all instances of 42774 with the desired port):

    docker run --rm -it --name harpia-gateway -v './config/config1.py':/cfg.py -p 42774:42774 harpia-model-gateway:1.0 --config=/cfg.py server --host=0.0.0.0 --port=42774 --debug
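
  • Once the server is running, the HarpIA Ajax plugin sends it requests over HTTP. As a purely illustrative client sketch (the route, payload fields, and response field below are assumptions; the actual protocol is defined by the HarpIA Ajax plugin), a request could be issued like this:

    ```python
    # Illustrative client; the endpoint path and payload/response field
    # names are guesses, not this project's documented protocol.
    import json
    import urllib.request


    def build_payload(message: str, model: str) -> bytes:
        """Serialize a question for the gateway as JSON bytes."""
        return json.dumps({"model": model, "message": message}).encode()


    def ask(message: str, model: str,
            host: str = "localhost", port: int = 42774) -> str:
        """POST a question to the gateway and return the answer text."""
        req = urllib.request.Request(
            f"http://{host}:{port}/",  # route name is a guess
            data=build_payload(message, model),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["answer"]  # field name is a guess
    ```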
