[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci

Signed-off-by: Juan Maturino <[email protected]>
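
This commit applies automatic fixes from the repository's pre-commit hooks. The actual hook list is not shown in this commit; as a purely illustrative sketch, a `.pre-commit-config.yaml` that would produce whitespace and Black-style formatting fixes like the ones below might look like this:

```yaml
# Illustrative only; the repository's real config may differ.
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.6.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/psf/black
    rev: 24.8.0
    hooks:
      - id: black
```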
pre-commit-ci[bot] authored and jjmaturino committed Dec 18, 2024
1 parent b26cea8 commit 4cd8630
Showing 2 changed files with 4 additions and 9 deletions.
5 changes: 3 additions & 2 deletions comps/llms/summarization/predictionguard/README.md
````diff
@@ -4,16 +4,17 @@
 
 # Getting Started
 
-## 🚀1. Start Microservice with Docker 🐳
+## 🚀1. Start Microservice with Docker 🐳
 
 ### 1.1 Set up Prediction Guard API Key
 
 You can get your API key from the [Prediction Guard Discord channel](https://discord.gg/TFHgnhAFKd).
 
 ```bash
 export PREDICTIONGUARD_API_KEY=<your_api_key>
 ```
 
-### 1.2 Build Docker Image
+### 1.2 Build Docker Image
 
 ```bash
 docker build -t opea/llm-docsum-predictionguard:latest -f comps/llms/summarization/predictionguard/Dockerfile .
````
````diff
@@ -7,7 +7,6 @@
 from fastapi.responses import StreamingResponse
 from predictionguard import PredictionGuard
 
-
 from comps import (
     GeneratedDoc,
     LLMParamsDoc,
@@ -74,12 +73,7 @@ async def stream_generator():
             top_k=input.top_k,
         )
 
-        print(json.dumps(
-            response,
-            sort_keys=True,
-            indent=4,
-            separators=(',', ': ')
-        ))
+        print(json.dumps(response, sort_keys=True, indent=4, separators=(",", ": ")))
 
         response_text = response["choices"][0]["message"]["content"]
     except Exception as e:
````
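
The collapsed `print` call in this diff is a pure formatting change: quote style and line layout differ, but the serialized JSON is identical. A minimal standalone check, using a hypothetical payload shaped like a chat-completions response (the real `response` comes from the PredictionGuard client):

```python
import json

# Hypothetical payload; the real response comes from the PredictionGuard client.
response = {"choices": [{"message": {"content": "A short summary."}}]}

# The multi-line call as it appeared before the hook ran.
before = json.dumps(
    response,
    sort_keys=True,
    indent=4,
    separators=(',', ': ')
)
# The one-line call the hook produced.
after = json.dumps(response, sort_keys=True, indent=4, separators=(",", ": "))

# Quote style in the separators tuple is Python syntax only;
# the serialized output is byte-for-byte identical.
assert before == after
```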
