[Feature]: Hope compatibility with this API can be added in the future #928
Labels
m: Provider
Related to the OpenAI API or other LLM models
Comments
Main modified code:

async def _make_msg(
    self,
    chat_completion: typing.Union[chat_completion.ChatCompletion, str],  # also accept a raw string
) -> llm_entities.Message:
    if isinstance(chat_completion, str):
        # String input: strip the leading 'data:' prefix and process each data chunk
        if chat_completion.startswith("data:"):
            # Split into the individual data sections, dropping the empty leading piece
            parts = chat_completion.split("data:")[1:]
            combined_message = ""
            for part in parts:
                part = part.strip()  # remove surrounding whitespace
                try:
                    # Try to parse each data section as JSON
                    part_data = json.loads(part)
                    if isinstance(part_data, dict) and 'choices' in part_data:
                        # Extract the delta contents and concatenate them
                        choices = part_data['choices']
                        for choice in choices:
                            if 'delta' in choice and 'content' in choice['delta']:
                                combined_message += choice['delta']['content']
                    else:
                        raise ValueError("Invalid response structure in part")
                except Exception as e:
                    # If this section cannot be parsed (e.g. the '[DONE]' sentinel), skip it and continue
                    self.ap.logger.error(f"Failed to parse chat completion part: {e}")
                    continue
            # Merge all chunk contents into one complete response
            chatcmpl_message = {"content": combined_message, "role": "assistant"}
        else:
            # The response does not start with 'data:', so treat it as a plain JSON body
            try:
                chat_completion = json.loads(chat_completion)
                if isinstance(chat_completion, dict) and 'choices' in chat_completion:
                    chatcmpl_message = chat_completion['choices'][0]['message']
                else:
                    raise ValueError("Invalid response structure: missing 'choices' key")
            except Exception as e:
                self.ap.logger.error(f"Failed to parse chat completion string: {e}")
                raise ValueError(f"Invalid response format: {chat_completion}")
    else:
        # A ChatCompletion object: use it directly
        chatcmpl_message = chat_completion.choices[0].message.dict()

    # Make sure the 'role' field exists and is not None
    if 'role' not in chatcmpl_message or chatcmpl_message['role'] is None:
        chatcmpl_message['role'] = 'assistant'

    message = llm_entities.Message(**chatcmpl_message)
    return message
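For quick local verification, here is a minimal self-contained sketch of the same 'data:'-splitting logic extracted from the method above; the helper name combine_sse_chunks and the sample payload are illustrative and not part of QChatGPT:

```python
import json

def combine_sse_chunks(raw: str) -> str:
    """Concatenate the delta contents of an SSE-style 'data:' stream (illustrative helper)."""
    combined = ""
    for part in raw.split("data:")[1:]:
        part = part.strip()
        try:
            chunk = json.loads(part)
        except json.JSONDecodeError:
            continue  # skips non-JSON sentinels such as '[DONE]'
        if not isinstance(chunk, dict):
            continue
        for choice in chunk.get("choices", []):
            combined += choice.get("delta", {}).get("content", "")
    return combined

# Example with a made-up two-chunk stream:
raw = (
    'data: {"choices":[{"delta":{"content":"Hello, "}}]}\n\n'
    'data: {"choices":[{"delta":{"content":"world."}}]}\n\n'
    'data: [DONE]\n\n'
)
print(combine_sse_chunks(raw))  # -> Hello, world.
```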
sanxianxiaohuntun changed the title from "[Feature]: Hope compatibility with the Nalang电子魅魔 API can be added in the future" to "[Feature]: Hope compatibility with this API can be added in the future" on Nov 21, 2024
What kind of issue is this?
Optimization of an existing feature
Detailed description
QChatGPT\pkg\provider\modelmgr\apis\chatcmpl.py
Code after the compatibility change (shown above)
API response
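The raw response body referenced here is not reproduced in the issue text. Purely as an illustration (these are not the actual Nalang payloads), the two string response shapes that the modified `_make_msg` accepts would look roughly like this:

```python
# Illustrative only: assumed shapes of the two string responses the patch handles.

# SSE-style streaming body, handled by the 'data:' branch:
sse_body = (
    'data: {"choices":[{"delta":{"content":"Hello"}}]}\n\n'
    'data: [DONE]\n\n'
)

# Plain JSON body, handled by the fallback branch:
json_body = '{"choices":[{"message":{"role":"assistant","content":"Hello"}}]}'
```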
After the compatibility change it runs normally and produces output:
呜~好舒服的抚摸呢。(蹭主人的手) ("Mmm~ such a comfortable petting. (nuzzles its owner's hand)")