diff --git a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md
index 5533f0750..bb9239abf 100644
--- a/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md
+++ b/ProductivitySuite/docker_compose/intel/cpu/xeon/README.md
@@ -293,7 +293,7 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
 10. DocSum LLM Microservice
 
     ```bash
-    curl http://${host_ip}:9002/v1/chat/docsum\
+    curl http://${host_ip}:9003/v1/chat/docsum\
       -X POST \
       -d '{"query":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5"}' \
       -H 'Content-Type: application/json'
@@ -302,7 +302,7 @@ Please refer to **[keycloak_setup_guide](keycloak_setup_guide.md)** for more det
 11. FAQGen LLM Microservice
 
     ```bash
-    curl http://${host_ip}:9003/v1/faqgen\
+    curl http://${host_ip}:9002/v1/faqgen\
       -X POST \
       -d '{"query":"Text Embeddings Inference (TEI) is a toolkit for deploying and serving open source text embeddings and sequence classification models. TEI enables high-performance extraction for the most popular models, including FlagEmbedding, Ember, GTE and E5"}' \
       -H 'Content-Type: application/json'
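
The port swap is easiest to sanity-check against the compose configuration the README accompanies. Below is a minimal sketch for doing that, assuming the compose file is named `compose.yaml` and the two services are identifiable by names containing `docsum` and `faqgen`; those names are assumptions for illustration, not taken from this diff.

```bash
# Sketch only: compose.yaml and the docsum/faqgen service naming are assumptions.
cd ProductivitySuite/docker_compose/intel/cpu/xeon

# Show the published host ports declared for the two LLM microservices.
grep -n -A 20 -iE 'docsum|faqgen' compose.yaml | grep -iE 'container_name|ports|"90'

# With the stack running, confirm the live port mappings directly.
docker ps --format '{{.Names}}\t{{.Ports}}' | grep -iE 'docsum|faqgen'
```

Whichever host port each service actually publishes is the one its curl example in the README should use; this diff updates the examples so DocSum is called on 9003 and FAQGen on 9002.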