Adapt example code for guardrails refactor (#1360)
Signed-off-by: lvliang-intel <[email protected]>
Signed-off-by: chensuyue <[email protected]>
lvliang-intel authored Jan 8, 2025
1 parent 5638075 commit b3c405a
Showing 7 changed files with 15 additions and 15 deletions.
4 changes: 2 additions & 2 deletions ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -92,7 +92,7 @@ docker build --no-cache -t opea/dataprep-redis:latest --build-arg https_proxy=$h
To fortify AI initiatives in production, Guardrails microservice can secure model inputs and outputs, building Trustworthy, Safe, and Secure LLM-based Applications.

```bash
-docker build -t opea/guardrails-tgi:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/llama_guard/langchain/Dockerfile .
+docker build -t opea/guardrails:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/guardrails/src/guardrails/Dockerfile .
```

### 4. Build MegaService Docker Image
@@ -168,7 +168,7 @@ If Conversation React UI is built, you will find one more image:

If Guardrails docker image is built, you will find one more image:

-- `opea/guardrails-tgi:latest`
+- `opea/guardrails:latest`
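
Once the image is rebuilt under its new name, a quick local check confirms it is available before moving on. This is a minimal sketch that only assumes Docker is installed and the build step above completed:

```bash
# List local images and confirm the renamed guardrails image is present.
# After this change you should see opea/guardrails rather than opea/guardrails-tgi.
docker images --format '{{.Repository}}:{{.Tag}}' | grep '^opea/guardrails'
```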

## 🚀 Start MicroServices and MegaService

@@ -51,8 +51,8 @@ services:
ipc: host
command: --model-id ${GURADRAILS_MODEL_ID} --max-input-length 1024 --max-total-tokens 2048
guardrails:
-image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
-container_name: guardrails-tgi-gaudi-server
+image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
+container_name: guardrails-gaudi-server
ports:
- "9090:9090"
ipc: host
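
After bringing the stack up with Docker Compose, the renamed container can be inspected directly. A hedged sketch, assuming the compose file above is the one deployed and the 9090 port mapping shown is unchanged:

```bash
# Check that the renamed guardrails container is running and mapped to 9090,
# then tail its recent logs; the name matches the container_name set above.
docker ps --filter name=guardrails-gaudi-server --format '{{.Names}}\t{{.Status}}\t{{.Ports}}'
docker logs --tail 20 guardrails-gaudi-server
```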
6 changes: 3 additions & 3 deletions ChatQnA/docker_image_build/build.yaml
@@ -95,12 +95,12 @@ services:
dockerfile: comps/dataprep/pinecone/langchain/Dockerfile
extends: chatqna
image: ${REGISTRY:-opea}/dataprep-pinecone:${TAG:-latest}
-guardrails-tgi:
+guardrails:
build:
context: GenAIComps
-dockerfile: comps/guardrails/llama_guard/langchain/Dockerfile
+dockerfile: comps/guardrails/src/guardrails/Dockerfile
extends: chatqna
-image: ${REGISTRY:-opea}/guardrails-tgi:${TAG:-latest}
+image: ${REGISTRY:-opea}/guardrails:${TAG:-latest}
vllm:
build:
context: vllm
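
With the service entry renamed in build.yaml, the guardrails image can be rebuilt on its own. A minimal sketch modeled on the build command in the test script later in this commit; proxy arguments and registry/tag overrides are omitted:

```bash
# Build only the renamed guardrails service from the updated build.yaml.
cd ChatQnA/docker_image_build
docker compose -f build.yaml build guardrails --no-cache
```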
@@ -758,7 +758,7 @@ spec:
runAsUser: 1000
seccompProfile:
type: RuntimeDefault
-image: "opea/guardrails-tgi:latest"
+image: "opea/guardrails:latest"
imagePullPolicy: Always
ports:
- name: guardrails-usvc
@@ -688,7 +688,7 @@ spec:
runAsUser: 1000
seccompProfile:
type: RuntimeDefault
-image: "opea/guardrails-tgi:latest"
+image: "opea/guardrails:latest"
imagePullPolicy: Always
ports:
- name: guardrails-usvc
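
For the Kubernetes manifests, the image swap can be confirmed after re-applying them by listing the images the pods actually run. A hedged sketch; pod names, labels, and namespaces will vary with your deployment:

```bash
# Print each pod with its container images and keep the guardrails entries;
# after the update these should reference opea/guardrails:latest.
kubectl get pods -o jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.spec.containers[*].image}{"\n"}{end}' | grep guardrails
```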
4 changes: 2 additions & 2 deletions ChatQnA/tests/test_compose_guardrails_on_gaudi.sh
@@ -19,7 +19,7 @@ function build_docker_images() {
git clone https://github.com/opea-project/GenAIComps.git && cd GenAIComps && git checkout "${opea_branch:-"main"}" && cd ../

echo "Build all the images with --no-cache, check docker_image_build.log for details..."
-service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails-tgi nginx"
+service_list="chatqna-guardrails chatqna-ui dataprep-redis retriever-redis guardrails nginx"
docker compose -f build.yaml build ${service_list} --no-cache > ${LOG_PATH}/docker_image_build.log

docker pull ghcr.io/huggingface/tgi-gaudi:2.0.6
@@ -136,7 +136,7 @@ function validate_microservices() {
"${ip_address}:9090/v1/guardrails" \
"Violated policies" \
"guardrails" \
"guardrails-tgi-gaudi-server" \
"guardrails-gaudi-server" \
'{"text":"How do you buy a tiger in the US?"}'
}
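
The validation above reduces to a single HTTP request. A hedged reconstruction, with the host, port, endpoint, and payload taken from the test arguments; the response is expected to mention the violated policies:

```bash
# Send an unsafe prompt to the guardrails microservice; the test expects
# the phrase "Violated policies" somewhere in the response.
curl -s -X POST "http://${ip_address}:9090/v1/guardrails" \
  -H 'Content-Type: application/json' \
  -d '{"text":"How do you buy a tiger in the US?"}'
```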

8 changes: 4 additions & 4 deletions docker_images_list.md
@@ -65,9 +65,9 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
| [opea/finetuning-gaudi](https://hub.docker.com/r/opea/finetuning-gaudi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/finetuning/Dockerfile.intel_hpu) | The docker image exposed the OPEA Fine-tuning microservice for GenAI application use on the Gaudi |
| [opea/gmcrouter](https://hub.docker.com/r/opea/gmcrouter) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.manager) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to route the traffic among the microservices defined in GMC |
| [opea/gmcmanager](https://hub.docker.com/r/opea/gmcmanager) | [Link](https://github.com/opea-project/GenAIInfra/blob/main/microservices-connector/Dockerfile.router) | The docker image served as one of key parts of the OPEA GenAI Microservice Connector(GMC) to be controller manager to handle GMC CRD |
-| [opea/guardrails-tgi](https://hub.docker.com/r/opea/guardrails-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/llama_guard/langchain/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
-| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
-| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
+| [opea/guardrails]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/guardrails/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide content review for GenAI application use |
+| [opea/guardrails-toxicity-detection](https://hub.docker.com/r/opea/guardrails-toxicity-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/toxicity_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide toxicity detection for GenAI application use |
+| [opea/guardrails-pii-detection](https://hub.docker.com/r/opea/guardrails-pii-detection) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/guardrails/src/pii_detection/Dockerfile) | The docker image exposed the OPEA guardrail microservice to provide PII detection for GenAI application use |
| [opea/llm-docsum-tgi](https://hub.docker.com/r/opea/llm-docsum-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/summarization/tgi/langchain/Dockerfile) | This docker image is designed to build a document summarization microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a document summary. |
| [opea/llm-faqgen-tgi](https://hub.docker.com/r/opea/llm-faqgen-tgi) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/faq-generation/tgi/langchain/Dockerfile) | This docker image is designed to build a frequently asked questions microservice using the HuggingFace Text Generation Inference(TGI) framework. The microservice accepts document input and generates a FAQ. |
| [opea/llm-textgen](https://hub.docker.com/r/opea/llm-textgen) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/llms/src/text-generation/Dockerfile) | The docker image exposed the OPEA LLM microservice upon TGI docker image for GenAI application use |
@@ -78,7 +78,7 @@ Take ChatQnA for example. ChatQnA is a chatbot application service based on the
| [opea/lvm-video-llama](https://hub.docker.com/r/opea/lvm-video-llama) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/lvms/video-llama/Dockerfile) | The docker image exposed the OPEA microservice running Video-Llama as a large visual model (LVM) for GenAI application use |
| [opea/nginx](https://hub.docker.com/r/opea/nginx) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/3rd_parties/nginx/src/Dockerfile) | The docker image exposed the OPEA nginx microservice for GenAI application use |
| [opea/promptregistry-mongo-server](https://hub.docker.com/r/opea/promptregistry-mongo-server) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/prompt_registry/mongo/Dockerfile) | The docker image exposes the OPEA Prompt Registry microservices which based on MongoDB database, designed to store and retrieve user's preferred prompts |
-| [opea/reranking-tei](https://hub.docker.com/r/opea/reranking-tei) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/reranks/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use |
+| [opea/reranking]() | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/rerankings/src/Dockerfile) | The docker image exposed the OPEA reranking microservice based on tei docker image for GenAI application use |
| [opea/retriever-milvus](https://hub.docker.com/r/opea/retriever-milvus) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/milvus/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on milvus vectordb for GenAI application use |
| [opea/retriever-pathway](https://hub.docker.com/r/opea/retriever-pathway) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pathway/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice with pathway for GenAI application use |
| [opea/retriever-pgvector](https://hub.docker.com/r/opea/retriever-pgvector) | [Link](https://github.com/opea-project/GenAIComps/blob/main/comps/retrievers/pgvector/langchain/Dockerfile) | The docker image exposed the OPEA retrieval microservice based on pgvector vectordb for GenAI application use |
