doc: openAIChat #63

Closed
wants to merge 54 commits into from
Commits
8da8794
Adding image support to ollamaChat
ccreutzi Jul 25, 2024
056c557
Add vision support to the doc files
ccreutzi Jul 25, 2024
b9922f0
Merge branch 'main' into ollama-images
ccreutzi Jul 25, 2024
ed2aa22
Preinstall bakllava
ccreutzi Jul 25, 2024
d24fa00
Text only edits to error message catalog (doc review)
MiriamScharnke Jul 25, 2024
1701188
Add doc warning about Ollama image input for text-only models
ccreutzi Jul 26, 2024
bb4ba1a
Update doc/Ollama.md
ccreutzi Jul 26, 2024
7aaadb4
WARNING -> TIP
ccreutzi Jul 26, 2024
7bed93d
Update +llms/+utils/errorMessageCatalog.m
MiriamScharnke Jul 26, 2024
83eff3a
Update +llms/+utils/errorMessageCatalog.m
MiriamScharnke Jul 26, 2024
1384590
Moved mlx into sub-directory, added Markdown export
ccreutzi Jul 29, 2024
c31e76e
Correct mlx path in texampleTests.m
ccreutzi Jul 29, 2024
570aaa1
Inserted text should have MATLAB®
ccreutzi Jul 29, 2024
50b9b0f
Also ignore data directory from moved mlx files
ccreutzi Jul 29, 2024
581a4a2
Add ® in generated files
ccreutzi Jul 29, 2024
99c0902
Merge branch 'generate-md-from-mlx' of github.com:matlab-deep-learnin…
ccreutzi Jul 29, 2024
a7b0627
Create functions directory in the doc folder and add openAIChat docum…
MiriamScharnke Jul 29, 2024
c3bec04
Merge pull request #59 from matlab-deep-learning/ollama-images
ccreutzi Jul 29, 2024
19c70ee
Update doc/functions/openAIChat.md
MiriamScharnke Jul 29, 2024
56a65e7
Update doc/functions/openAIChat.md
MiriamScharnke Jul 29, 2024
b0f15e2
Update doc/functions/openAIChat.md
MiriamScharnke Jul 29, 2024
ada0f7c
Merge pull request #62 from matlab-deep-learning/generate-md-from-mlx
ccreutzi Jul 30, 2024
1ed0f7c
Apply suggestions from code review
MiriamScharnke Jul 30, 2024
595316f
Update doc/functions/openAIChat.md
MiriamScharnke Jul 30, 2024
ada0ff8
Update openAIChat documentation.
MiriamScharnke Jul 30, 2024
8426bb7
Merge branch 'temp-change' into doc-openaichat
MiriamScharnke Jul 30, 2024
ddcf6f1
Merge pull request #60 from matlab-deep-learning/error-message-text-o…
MiriamScharnke Jul 31, 2024
72d171e
Fix OpenAI trademarks
MiriamScharnke Aug 1, 2024
223347b
Delete doc/functions/openAIChat.md
MiriamScharnke Aug 1, 2024
ae07cd9
Merge pull request #66 from matlab-deep-learning/fix-openai-trademark
MiriamScharnke Aug 1, 2024
b272344
Technical feedback and cosmetic changes for openAIChat reference page
MiriamScharnke Aug 1, 2024
ae46982
Cosmetic changes for openAIChat reference page
MiriamScharnke Aug 1, 2024
a7c469a
Fix trademark on openAIChat reference page
MiriamScharnke Aug 1, 2024
57b22e1
Allow (long) char vectors for StopSequences
ccreutzi Aug 1, 2024
4563e70
Cosmetic fixes
MiriamScharnke Aug 1, 2024
ea8874b
Merge pull request #67 from matlab-deep-learning/allow-char-StopSeque…
ccreutzi Aug 2, 2024
172a4da
Replace bakllava by moondream
ccreutzi Aug 2, 2024
5fa9534
Create CODEOWNERS
ccreutzi Aug 2, 2024
e72cc99
Merge pull request #68 from matlab-deep-learning/replace-bakllava
ccreutzi Aug 2, 2024
062538c
Merge pull request #69 from matlab-deep-learning/codeowners
ccreutzi Aug 2, 2024
2565f27
Trace/replay `llms.internal.sendRequest`
ccreutzi Aug 5, 2024
53c1df5
Avoid bogus access to `json.choices.delta.content`
ccreutzi Aug 5, 2024
fce8bb8
Merge pull request #71 from matlab-deep-learning/avoid-content-error
ccreutzi Aug 5, 2024
2cbda4d
Merge pull request #70 from matlab-deep-learning/test-doubles
ccreutzi Aug 6, 2024
a1b48e1
Cosmetic fixes
MiriamScharnke Aug 6, 2024
bee4e39
Update doc/functions/openAIChat.md
MiriamScharnke Jul 30, 2024
e8f20cd
Technical feedback and cosmetic changes for openAIChat reference page
MiriamScharnke Aug 1, 2024
356bcd4
Cosmetic changes for openAIChat reference page
MiriamScharnke Aug 1, 2024
4b68fc7
Fix trademark on openAIChat reference page
MiriamScharnke Aug 1, 2024
07bef3c
Cosmetic fixes
MiriamScharnke Aug 1, 2024
963a6d9
Cosmetic fixes
MiriamScharnke Aug 6, 2024
874536b
Cosmetic fixes
MiriamScharnke Aug 6, 2024
3da3ae6
Merge branch 'doc-openaichat' of github.com:matlab-deep-learning/llms…
MiriamScharnke Aug 6, 2024
c357e75
Merge remote-tracking branch 'origin' into doc-openaichat
MiriamScharnke Aug 6, 2024
2 changes: 1 addition & 1 deletion +llms/+internal/callAzureChatAPI.m
@@ -64,7 +64,7 @@

parameters = buildParametersCall(messages, functions, nvp);

[response, streamedText] = llms.internal.sendRequest(parameters,nvp.APIKey, URL, nvp.TimeOut, nvp.StreamFun);
[response, streamedText] = llms.internal.sendRequestWrapper(parameters,nvp.APIKey, URL, nvp.TimeOut, nvp.StreamFun);

% If call errors, "choices" will not be part of response.Body.Data, instead
% we get response.Body.Data.error
2 changes: 1 addition & 1 deletion +llms/+internal/callOllamaChatAPI.m
@@ -53,7 +53,7 @@

parameters = buildParametersCall(model, messages, nvp);

[response, streamedText] = llms.internal.sendRequest(parameters,[],URL,nvp.TimeOut,nvp.StreamFun);
[response, streamedText] = llms.internal.sendRequestWrapper(parameters,[],URL,nvp.TimeOut,nvp.StreamFun);

% If call errors, "choices" will not be part of response.Body.Data, instead
% we get response.Body.Data.error
2 changes: 1 addition & 1 deletion +llms/+internal/callOpenAIChatAPI.m
@@ -62,7 +62,7 @@

parameters = buildParametersCall(messages, functions, nvp);

[response, streamedText] = llms.internal.sendRequest(parameters,nvp.APIKey, END_POINT, nvp.TimeOut, nvp.StreamFun);
[response, streamedText] = llms.internal.sendRequestWrapper(parameters,nvp.APIKey, END_POINT, nvp.TimeOut, nvp.StreamFun);

% If call errors, "choices" will not be part of response.Body.Data, instead
% we get response.Body.Data.error
5 changes: 5 additions & 0 deletions +llms/+internal/sendRequestWrapper.m
@@ -0,0 +1,5 @@
function [response, streamedText] = sendRequestWrapper(varargin)
% This function is undocumented and will change in a future release

% A wrapper around sendRequest to have a test seam
[response, streamedText] = llms.internal.sendRequest(varargin{:});
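The wrapper above exists purely as a test seam: a test-only version placed earlier on the MATLAB path can shadow `llms.internal.sendRequestWrapper` and replay recorded traffic instead of calling the live API. A minimal sketch of such a double — the folder layout and MAT-file name here are hypothetical, not part of this PR:

```matlab
function [response, streamedText] = sendRequestWrapper(varargin)
% Test-double sketch: shadows the shipping +llms/+internal/sendRequestWrapper.m
% and replays a canned response instead of calling llms.internal.sendRequest.
% "recordedResponse.mat" is a made-up recording file name.
s = load("recordedResponse.mat","response","streamedText");
response = s.response;
streamedText = s.streamedText;
end
```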
6 changes: 6 additions & 0 deletions +llms/+internal/textGenerator.m
@@ -28,4 +28,10 @@
properties (Access=protected)
StreamFun
end

methods
function hObj = set.StopSequences(hObj,value)
hObj.StopSequences = string(value);
end
end
end
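The new setter normalizes `StopSequences` to a string array, which lets the char-vector inputs accepted by the updated `mustBeValidStop` validator (further down in this diff) round-trip cleanly. A small sketch of the effect, assuming a valid key is available in the `OPENAI_API_KEY` environment variable:

```matlab
% Sketch only: constructing the chat object requires a configured API key.
chat = openAIChat("You are a helpful assistant.", StopSequences='overall');
class(chat.StopSequences)   % "string" - normalized by the setter above
chat.StopSequences          % "overall": one stop sequence, not seven characters
```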
3 changes: 2 additions & 1 deletion +llms/+stream/responseStreamer.m
@@ -84,7 +84,8 @@
end
this.StreamFun('');
this.ResponseText = txt;
else
elseif isfield(json.choices,"delta") && ...
isfield(json.choices.delta,"content")
txt = json.choices.delta.content;
this.StreamFun(txt);
this.ResponseText = [this.ResponseText txt];
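The added `isfield` checks avoid indexing into `delta.content` for streaming chunks that carry no text (for example, a chunk that only reports a finish reason). The same guard pattern in isolation:

```matlab
% Decoded streaming chunk without a "content" field - previously this threw.
json = jsondecode('{"choices":[{"delta":{"role":"assistant"},"finish_reason":null}]}');
if isfield(json.choices,"delta") && isfield(json.choices.delta,"content")
    txt = json.choices.delta.content;   % normal text chunk
else
    txt = "";                           % nothing to stream for this chunk
end
```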
22 changes: 11 additions & 11 deletions +llms/+utils/errorMessageCatalog.m
@@ -41,22 +41,22 @@
catalog("llms:mustBeAssistantWithIdAndFunction") = "Field 'tool_call' must be a struct with fields 'id' and 'function'.";
catalog("llms:mustBeAssistantWithNameAndArguments") = "Field 'function' must be a struct with fields 'name' and 'arguments'.";
catalog("llms:assistantMustHaveTextNameAndArguments") = "Fields 'name' and 'arguments' must be text with one or more characters.";
catalog("llms:mustBeValidIndex") = "Index exceeds the number of array elements. Index must be less than or equal to ({1}).";
catalog("llms:stopSequencesMustHaveMax4Elements") = "Number of elements must not be larger than 4.";
catalog("llms:mustBeValidIndex") = "Index exceeds the number of array elements. Index must be less than or equal to {1}.";
catalog("llms:stopSequencesMustHaveMax4Elements") = "Number of stop sequences must be less than or equal to 4.";
catalog("llms:endpointMustBeSpecified") = "Unable to find endpoint. Either set environment variable AZURE_OPENAI_ENDPOINT or specify name-value argument ""Endpoint"".";
catalog("llms:deploymentMustBeSpecified") = "Unable to find deployment name. Either set environment variable AZURE_OPENAI_DEPLOYMENT or specify name-value argument ""Deployment"".";
catalog("llms:keyMustBeSpecified") = "Unable to find API key. Either set environment variable {1} or specify name-value argument ""APIKey"".";
catalog("llms:mustHaveMessages") = "Value must contain at least one message in Messages.";
catalog("llms:mustHaveMessages") = "Message history must not be empty.";
catalog("llms:mustSetFunctionsForCall") = "When no functions are defined, ToolChoice must not be specified.";
catalog("llms:mustBeMessagesOrTxt") = "Messages must be text with one or more characters or a messageHistory object.";
catalog("llms:invalidOptionAndValueForModel") = "'{1}' with value '{2}' is not supported for ModelName '{3}'";
catalog("llms:invalidOptionForModel") = "{1} is not supported for ModelName '{2}'";
catalog("llms:invalidContentTypeForModel") = "{1} is not supported for ModelName '{2}'";
catalog("llms:functionNotAvailableForModel") = "This function is not supported for ModelName '{1}'";
catalog("llms:promptLimitCharacter") = "Prompt must have a maximum length of {1} characters for ModelName '{2}'";
catalog("llms:pngExpected") = "Argument must be a PNG image.";
catalog("llms:mustBeMessagesOrTxt") = "Message must be nonempty string, character array, cell array of character vectors, or messageHistory object.";
catalog("llms:invalidOptionAndValueForModel") = "'{1}' with value '{2}' is not supported for model ""{3}"".";
catalog("llms:invalidOptionForModel") = "Invalid argument name {1} for model ""{2}"".";
catalog("llms:invalidContentTypeForModel") = "{1} is not supported for model ""{2}"".";
catalog("llms:functionNotAvailableForModel") = "Image editing is not supported for model ""{1}"".";
catalog("llms:promptLimitCharacter") = "Prompt must contain at most {1} characters for model ""{2}"".";
catalog("llms:pngExpected") = "Image must be a PNG file (*.png).";
catalog("llms:warningJsonInstruction") = "When using JSON mode, you must also prompt the model to produce JSON yourself via a system or user message.";
catalog("llms:apiReturnedError") = "Server error: ""{1}""";
catalog("llms:apiReturnedError") = "Server returned error indicating: ""{1}""";
catalog("llms:dimensionsMustBeSmallerThan") = "Dimensions must be less than or equal to {1}.";
catalog("llms:stream:responseStreamer:InvalidInput") = "Input does not have the expected json format, got ""{1}"".";
end
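For reference, the `{1}`, `{2}` markers in these messages are substitution holes. A hedged sketch of how a caller might fill them, assuming `getMessage` accepts the hole values as trailing arguments (the values below are purely illustrative):

```matlab
% Assumption: getMessage(key, hole1, hole2, ...) substitutes {1}, {2}, ...
msg = llms.utils.errorMessageCatalog.getMessage(...
    "llms:promptLimitCharacter", "4096", "dall-e-2");
% Expected: Prompt must contain at most 4096 characters for model "dall-e-2".
```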
1 change: 1 addition & 0 deletions +llms/+utils/mustBeValidStop.m
@@ -5,6 +5,7 @@ function mustBeValidStop(value)
if ~isempty(value)
mustBeVector(value);
mustBeNonzeroLengthText(value);
value = string(value);
% This restriction is set by the OpenAI API
if numel(value)>4
error("llms:stopSequencesMustHaveMax4Elements", llms.utils.errorMessageCatalog.getMessage("llms:stopSequencesMustHaveMax4Elements"));
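With the added `string` conversion, a char vector now counts as a single stop sequence rather than as one element per character, which is what the "Allow (long) char vectors for StopSequences" commit addresses. A quick sketch:

```matlab
llms.utils.mustBeValidStop("stop")             % string scalar: passes
llms.utils.mustBeValidStop('a longer stop')    % char vector: now one sequence, passes
llms.utils.mustBeValidStop(["a" "b" "c" "d"])  % four sequences: passes
% Five or more sequences still error with llms:stopSequencesMustHaveMax4Elements.
```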
34 changes: 34 additions & 0 deletions .githooks/pre-commit
@@ -0,0 +1,34 @@
#!/bin/bash

cd $(git rev-parse --show-toplevel)
pwd

# For all commits of mlx files, create corresponding Markdown (md) files.
# If the mlx files are in .../mlx-scripts/*.mlx, the corresponding
# md files will go into .../*.md.
#
# This script assumes that the mlx files as they currently exist in the file
# system are what is being committed, instead of doing extra work to get
# them from the staging area.
#
# Note that this script will not remove media files. If an mlx file has
# fewer plots at some point in the future, there will be file system
# cruft, which does not hurt the md display in GitHub or elsewhere.
changedMlxFiles=`git diff --cached --name-only --diff-filter=d '*.mlx'`

if [ -n "$changedMlxFiles" ]; then
# Keep the line break here; we replace end-of-line with "' '" to get the quotes right
matlab -batch "for file = {'${changedMlxFiles//
/' '}'}, export(file{1},replace(erase(file{1},'mlx-scripts'),'.mlx','.md')); end"
tmp=${changedMlxFiles//mlx-scripts\//}
mdFiles=${tmp//.mlx/.md}
for file in $mdFiles; do
if [ -d ${file%.md}_media ]; then
git add ${file%.md}_media/
fi
perl -pi -e "\$cnt++ if /^#/; " \
-e "\$_ .= \"\nTo run the code shown on this page, open the MLX file in MATLAB®: [mlx-scripts/$(basename $file .md).mlx](mlx-scripts/$(basename $file .md).mlx) \n\" if /^#/ && \$cnt==1;" \
$file
done
git add $mdFiles
fi
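For a single file, the conversion the hook performs boils down to one `export` call (the same call the hook builds in its `matlab -batch` loop). The file name below is one of the examples from the README and is used only for illustration; exporting live scripts to Markdown requires a recent MATLAB release:

```matlab
% One-file version of what the hook does, run from the repository root.
file = "examples/mlx-scripts/CreateSimpleChatBot.mlx";          % example path
export(file, replace(erase(file,"mlx-scripts"),".mlx",".md"));  % writes examples/CreateSimpleChatBot.md
```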
4 changes: 4 additions & 0 deletions .github/CODEOWNERS
@@ -0,0 +1,4 @@
# Code owners, to get auto-filled reviewer lists

# To start with, we just assume everyone in the core team is included on all reviews
* @adulai @ccreutzi @debymf @MiriamScharnke @vpapanasta
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -30,6 +30,7 @@ jobs:
- name: Pull models
run: |
ollama pull mistral
ollama pull moondream
OLLAMA_HOST=127.0.0.1:11435 ollama pull qwen2:0.5b
- name: Set up MATLAB
uses: matlab-actions/setup-matlab@v2
2 changes: 2 additions & 0 deletions .gitignore
@@ -1,10 +1,12 @@
*.env
*.asv
*.mat
!tests/recordings/*.mat
startup.m
papers_to_read.csv
data/*
examples/data/*
examples/mlx-scripts/data/*
._*
.nfs*
.DS_Store
11 changes: 11 additions & 0 deletions DEVELOPMENT.md
@@ -0,0 +1,11 @@
# Notes for Developers

Nothing in this file should be required knowledge to use the repository. These are notes for people actually making changes that are going to be submitted and incorporated into the main branch.

## Git Hooks

After checkout, link or (on Windows) copy the files from `.githooks` into the local `.git/hooks` folder:

```
(cd .git/hooks/; ln -s ../../.githooks/pre-commit .)
```
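On Windows, where symlinks may not be available, an equivalent copy can be done from MATLAB instead. A sketch, run from the repository root:

```matlab
% Copy the hook instead of linking it (Windows-friendly alternative).
copyfile(fullfile(".githooks","pre-commit"), fullfile(".git","hooks","pre-commit"));
```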
24 changes: 12 additions & 12 deletions README.md
@@ -1,8 +1,8 @@
# Large Language Models (LLMs) with MATLAB®
# Large Language Models (LLMs) with MATLAB

[![Open in MATLAB Online](https://www.mathworks.com/images/responsive/global/open-in-matlab-online.svg)](https://matlab.mathworks.com/open/github/v1?repo=matlab-deep-learning/llms-with-matlab) [![View Large Language Models (LLMs) with MATLAB on File Exchange](https://www.mathworks.com/matlabcentral/images/matlab-file-exchange.svg)](https://www.mathworks.com/matlabcentral/fileexchange/163796-large-language-models-llms-with-matlab)

This repository contains code to connect MATLAB to the [OpenAI Chat Completions API](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) (which powers ChatGPT™), OpenAI Images API (which powers DALL·E™), [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/), and both local and nonlocal [Ollama™](https://ollama.com/) models. This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.
This repository contains code to connect MATLAB® to the [OpenAI® Chat Completions API](https://platform.openai.com/docs/guides/text-generation/chat-completions-api) (which powers ChatGPT™), OpenAI Images API (which powers DALL·E™), [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/), and both local and nonlocal [Ollama™](https://ollama.com/) models. This allows you to leverage the natural language processing capabilities of large language models directly within your MATLAB environment.

## Requirements

@@ -52,16 +52,16 @@ To use this repository with a local installation of MATLAB, first clone the repo
## Examples
To learn how to use this in your workflows, see [Examples](/examples/).

- [ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.mlx](/examples/ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.mlx): Learn to implement a simple chat that stream the response.
- [SummarizeLargeDocumentsUsingChatGPTandMATLAB.mlx](/examples/SummarizeLargeDocumentsUsingChatGPTandMATLAB.mlx): Learn to create concise summaries of long texts with ChatGPT. (Requires Text Analytics Toolbox™)
- [CreateSimpleChatBot.mlx](/examples/CreateSimpleChatBot.mlx): Build a conversational chatbot capable of handling various dialogue scenarios using ChatGPT. (Requires Text Analytics Toolbox)
- [AnalyzeScientificPapersUsingFunctionCalls.mlx](/examples/AnalyzeScientificPapersUsingFunctionCalls.mlx): Learn how to create agents capable of executing MATLAB functions.
- [AnalyzeTextDataUsingParallelFunctionCallwithChatGPT.mlx](/examples/AnalyzeTextDataUsingParallelFunctionCallwithChatGPT.mlx): Learn how to take advantage of parallel function calling.
- [RetrievalAugmentedGenerationUsingChatGPTandMATLAB.mlx](/examples/RetrievalAugmentedGenerationUsingChatGPTandMATLAB.mlx): Learn about retrieval augmented generation with a simple use case. (Requires Text Analytics Toolbox™)
- [DescribeImagesUsingChatGPT.mlx](/examples/DescribeImagesUsingChatGPT.mlx): Learn how to use GPT-4 Turbo with Vision to understand the content of an image.
- [AnalyzeSentimentinTextUsingChatGPTinJSONMode.mlx](/examples/AnalyzeSentimentinTextUsingChatGPTinJSONMode.mlx): Learn how to use JSON mode in chat completions
- [UsingDALLEToEditImages.mlx](/examples/UsingDALLEToEditImages.mlx): Learn how to generate images
- [UsingDALLEToGenerateImages.mlx](/examples/UsingDALLEToGenerateImages.mlx): Create variations of images and editimages.
- [ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.md](/examples/ProcessGeneratedTextinRealTimebyUsingChatGPTinStreamingMode.md): Learn to implement a simple chat that streams the response.
- [SummarizeLargeDocumentsUsingChatGPTandMATLAB.md](/examples/SummarizeLargeDocumentsUsingChatGPTandMATLAB.md): Learn to create concise summaries of long texts with ChatGPT. (Requires Text Analytics Toolbox™)
- [CreateSimpleChatBot.md](/examples/CreateSimpleChatBot.md): Build a conversational chatbot capable of handling various dialogue scenarios using ChatGPT. (Requires Text Analytics Toolbox)
- [AnalyzeScientificPapersUsingFunctionCalls.md](/examples/AnalyzeScientificPapersUsingFunctionCalls.md): Learn how to create agents capable of executing MATLAB functions.
- [AnalyzeTextDataUsingParallelFunctionCallwithChatGPT.md](/examples/AnalyzeTextDataUsingParallelFunctionCallwithChatGPT.md): Learn how to take advantage of parallel function calling.
- [RetrievalAugmentedGenerationUsingChatGPTandMATLAB.md](/examples/RetrievalAugmentedGenerationUsingChatGPTandMATLAB.md): Learn about retrieval augmented generation with a simple use case. (Requires Text Analytics Toolbox™)
- [DescribeImagesUsingChatGPT.md](/examples/DescribeImagesUsingChatGPT.md): Learn how to use GPT-4 Turbo with Vision to understand the content of an image.
- [AnalyzeSentimentinTextUsingChatGPTinJSONMode.md](/examples/AnalyzeSentimentinTextUsingChatGPTinJSONMode.md): Learn how to use JSON mode in chat completions.
- [UsingDALLEToEditImages.md](/examples/UsingDALLEToEditImages.md): Learn how to edit images with DALL·E.
- [UsingDALLEToGenerateImages.md](/examples/UsingDALLEToGenerateImages.md): Learn how to generate images and create variations of existing images.

## License

36 changes: 35 additions & 1 deletion azureChat.m
@@ -191,7 +191,7 @@
if isstring(messages) && isscalar(messages)
messagesStruct = {struct("role", "user", "content", messages)};
else
messagesStruct = messages.Messages;
messagesStruct = this.encodeImages(messages.Messages);
end

if ~isempty(this.SystemPrompt)
@@ -251,6 +251,40 @@ function mustBeValidFunctionCall(this, functionCall)
end

end

function messageStruct = encodeImages(~, messageStruct)
for k=1:numel(messageStruct)
if isfield(messageStruct{k},"images")
images = messageStruct{k}.images;
detail = messageStruct{k}.image_detail;
messageStruct{k} = rmfield(messageStruct{k},["images","image_detail"]);
messageStruct{k}.content = ...
{struct("type","text","text",messageStruct{k}.content)};
for img = images(:).'
if startsWith(img,("https://"|"http://"))
s = struct( ...
"type","image_url", ...
"image_url",struct("url",img));
else
[~,~,ext] = fileparts(img);
MIMEType = "data:image/" + erase(ext,".") + ";base64,";
% Base64 encode the image using the given MIME type
fid = fopen(img);
im = fread(fid,'*uint8');
fclose(fid);
b64 = matlab.net.base64encode(im);
s = struct( ...
"type","image_url", ...
"image_url",struct("url",MIMEType + b64));
end

s.image_url.detail = detail;

messageStruct{k}.content{end+1} = s;
end
end
end
end
end
end

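The local-file branch of `encodeImages` builds an OpenAI-style data URL from the image bytes. The same steps in isolation — this sketch assumes an image such as `peppers.png` (shipped with Image Processing Toolbox, and used in the Azure doc below) is on the path:

```matlab
img = which("peppers.png");                       % any local PNG works here
[~,~,ext] = fileparts(img);
fid = fopen(img);
bytes = fread(fid,'*uint8');
fclose(fid);
dataURL = "data:image/" + erase(ext,".") + ";base64," + matlab.net.base64encode(bytes);
strlength(dataURL)                                % long string, sent as the image_url value
```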
19 changes: 16 additions & 3 deletions doc/Azure.md
@@ -1,6 +1,6 @@
# Connecting to Azure® OpenAI Service
# Connecting to Azure OpenAI Service

This repository contains code to connect MATLAB to the [Azure® OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/).
This repository contains code to connect MATLAB to the [Azure® OpenAI® Service](https://learn.microsoft.com/en-us/azure/ai-services/openai/).

To use Azure OpenAI Services, you need to create a model deployment on your Azure account and obtain one of the keys for it. You are responsible for any fees Azure may charge for the use of their APIs. You should be familiar with the limitations and risks associated with using this technology, and you agree that you shall be solely responsible for full compliance with any terms that may apply to your use of the Azure APIs.

@@ -31,7 +31,7 @@ loadenv(".env")

## Establishing a connection to Chat Completions API using Azure

To connect MATLAB to Chat Completions API via Azure, you will have to create an `azureChat` object. See [the Azure documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart) for details on the setup required and where to find your key, endpoint, and deployment name. As explained above, the endpoint, deployment, and key should be in the environment variables `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_DEPLOYMENYT`, and `AZURE_OPENAI_API_KEY`, or provided as `Endpoint=…`, `Deployment=…`, and `APIKey=…` in the `azureChat` call below.
To connect MATLAB® to the Chat Completions API via Azure, you will have to create an `azureChat` object. See [the Azure documentation](https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart) for details on the setup required and where to find your key, endpoint, and deployment name. As explained above, the endpoint, deployment, and key should be in the environment variables `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_DEPLOYMENT`, and `AZURE_OPENAI_API_KEY`, or provided as `Endpoint=…`, `Deployment=…`, and `APIKey=…` in the `azureChat` call below.

In order to create the chat assistant, use the `azureChat` function, optionally providing a system prompt:
```matlab
Expand Down Expand Up @@ -115,6 +115,19 @@ txt = generate(chat,"What is Model-Based Design and how is it related to Digital
% Should stream the response token by token
```

## Understanding the content of an image

You can use gpt-4o, gpt-4o-mini, or gpt-4-turbo to experiment with image understanding.
```matlab
chat = azureChat("You are an AI assistant.",Deployment="gpt-4o");
image_path = "peppers.png";
messages = messageHistory;
messages = addUserMessageWithImages(messages,"What is in the image?",image_path);
[txt,response] = generate(chat,messages,MaxNumTokens=4096);
txt
% outputs a description of the image
```

## Calling MATLAB functions with the API

Optionally, `Tools=functions` can be used to provide function specifications to the API. The purpose of this is to enable models to generate function arguments which adhere to the provided specifications.