## fix error in exception (infiniflow#2694)
### What problem does this PR solve?
infiniflow#2670

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
KevinHuSh authored Sep 30, 2024
1 parent aa0f6b6 commit f8ced38
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion rag/llm/chat_model.py
```diff
@@ -1340,7 +1340,7 @@ def chat(self, system, history, gen_conf):
                     + response["usage"]["output_tokens"],
                 )
             except Exception as e:
-                return ans + "\n**ERROR**: " + str(e), 0
+                return "\n**ERROR**: " + str(e), 0
         else:
             self.client._system_instruction = self.system
             if "max_tokens" in gen_conf:
```
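The change drops `ans` from the error return. If the API call fails before `ans` is assigned, the old handler's `return ans + ...` itself raises `UnboundLocalError`, which masks the original exception (the likely cause of infiniflow#2670). The sketch below illustrates that failure mode with hypothetical names (`chat_old`, `chat_fixed`, `FailingClient`); it is not the repository's code, just a minimal reproduction of the pattern.

```python
# Minimal sketch (hypothetical names) of the failure mode the commit fixes:
# referencing a variable in the except block that is only assigned inside the
# try block.

def chat_old(client):
    try:
        response = client.create()  # may raise before `ans` exists
        ans = response["content"][0]["text"]
        return ans, response["usage"]["output_tokens"]
    except Exception as e:
        # If client.create() raised, `ans` was never bound, so this line
        # raises UnboundLocalError and hides the original error.
        return ans + "\n**ERROR**: " + str(e), 0


def chat_fixed(client):
    try:
        response = client.create()
        ans = response["content"][0]["text"]
        return ans, response["usage"]["output_tokens"]
    except Exception as e:
        # No undefined name: the caller still gets the error text and 0 tokens.
        return "\n**ERROR**: " + str(e), 0


class FailingClient:
    """Stand-in client whose request always fails."""
    def create(self):
        raise RuntimeError("upstream API error")


if __name__ == "__main__":
    try:
        chat_old(FailingClient())
    except UnboundLocalError as err:
        print("old handler masks the real error:", err)

    text, tokens = chat_fixed(FailingClient())
    print("fixed handler returns:", repr(text), tokens)
```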
