Added support for Azure Openai #69

Open · wants to merge 13 commits into `main`
12 changes: 12 additions & 0 deletions README.md
@@ -278,6 +278,18 @@ ai("Thanks for your help!", tools=[search, lookup])
'tool': None}
```

### Azure OpenAI

Here is how Azure OpenAI can be initialized; everything else works the same as the OpenAI examples above.

```
ai = AIChat(
    api_key=api_key,  # your Azure OpenAI API key
    api_version=api_version,  # use the latest API version if you are trying out function calling
    model=chat_engine,  # the deployment name; use a new deployment if you are trying function calling
    api_type="azure",
    api_endpoint=api_endpoint,  # the full chat completions URL, not just the base URL
    system=system_instructions,
)
```
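At the request level, the main difference between the two backends is authentication: Azure expects an `api-key` header, while OpenAI expects a Bearer token. A minimal sketch of that distinction (the `build_headers` helper is illustrative, not part of simpleaichat):

```python
def build_headers(api_key: str, api_type: str = "openai") -> dict:
    """Build request headers; Azure uses an api-key header, OpenAI a Bearer token."""
    if api_type == "azure":
        return {"Content-Type": "application/json", "api-key": api_key}
    return {"Content-Type": "application/json", "Authorization": f"Bearer {api_key}"}
```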

## Miscellaneous Notes

- Like [gpt-2-simple](https://github.com/minimaxir/gpt-2-simple) before it, the primary motivation behind releasing simpleaichat is to both democratize access to ChatGPT even more and also offer more transparency for non-engineers into how Chat AI-based apps work under the hood given the disproportionate amount of media misinformation about their capabilities. This is inspired by real-world experience from [my work with BuzzFeed](https://tech.buzzfeed.com/the-right-tools-for-the-job-c05de96e949e) in the domain, where after spending a long time working with the popular [LangChain](https://github.com/hwchase17/langchain), a more-simple implementation was both much easier to maintain and resulted in much better generations. I began focusing development on simpleaichat after reading a [Hacker News thread](https://news.ycombinator.com/item?id=35820931) filled with many similar complaints, indicating value for an easier-to-use interface for modern AI tricks.
24 changes: 18 additions & 6 deletions simpleaichat/chatgpt.py
@@ -19,6 +19,11 @@ class ChatGPTSession(ChatSession):
system: str = "You are a helpful assistant."
params: Dict[str, Any] = {"temperature": 0.7}

def __init__(self, **kwargs):
super().__init__(**kwargs)
self.api_type = kwargs.get("api_type", "openai")
self.api_url = kwargs.get("api_endpoint", "https://api.openai.com/v1/chat/completions")

def prepare_request(
self,
prompt: str,
@@ -29,10 +34,16 @@ def prepare_request(
output_schema: Any = None,
is_function_calling_required: bool = True,
):
if self.api_type == "azure":
    headers = {
        "Content-Type": "application/json",
        "api-key": self.auth["api_key"].get_secret_value(),
    }
else:
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {self.auth['api_key'].get_secret_value()}",
    }

system_message = ChatMessage(role="system", content=system or self.system)
if not input_schema:
@@ -52,9 +63,10 @@
"model": self.model,
"messages": self.format_input_messages(system_message, user_message),
"stream": stream,
**gen_params,
}

# Add function calling parameters if a schema is provided
if input_schema or output_schema:
functions = []
@@ -158,7 +170,7 @@ def stream(
chunk = chunk[6:] # SSE JSON chunks are prepended with "data: "
if chunk != "[DONE]":
chunk_dict = orjson.loads(chunk)
delta = chunk_dict["choices"][0]["delta"].get("content") if chunk_dict["choices"] else None
if delta:
content.append(delta)
yield {"delta": delta, "response": "".join(content)}
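The guard on `chunk_dict["choices"]` matters because Azure can emit SSE chunks whose `choices` list is empty. A self-contained sketch of the parsing step, using the stdlib `json` in place of `orjson` (the `extract_delta` function name is illustrative, not part of simpleaichat):

```python
import json

def extract_delta(raw: str):
    """Parse one SSE line and return the text delta, or None if there is none."""
    if raw.startswith("data: "):
        raw = raw[6:]  # SSE JSON chunks are prepended with "data: "
    if raw == "[DONE]":
        return None
    chunk = json.loads(raw)
    if not chunk["choices"]:  # Azure may send chunks with an empty choices list
        return None
    return chunk["choices"][0]["delta"].get("content")
```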
5 changes: 5 additions & 0 deletions simpleaichat/models.py
@@ -48,6 +48,11 @@ class ChatSession(BaseModel):
total_completion_length: int = 0
total_length: int = 0
title: Optional[str] = None
api_type: str = "openai"  # a default is needed, or validation fails when api_type is omitted

def __init__(self, **kwargs: Any) -> None:
super().__init__(**kwargs)
self.api_type = kwargs.get("api_type", "openai")

def __str__(self) -> str:
sess_start_str = self.created_at.strftime("%Y-%m-%d %H:%M:%S")
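As an aside, the `kwargs.get` fallback in `__init__` can also be expressed directly as a field default, which keeps the fallback in one place. A minimal dataclass sketch of the idea (illustrative only; `ChatSession` itself is a pydantic model, where a field default works the same way):

```python
from dataclasses import dataclass

@dataclass
class SessionConfig:
    # A field default gives the "openai" fallback without a custom __init__.
    api_type: str = "openai"
```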