
Feature/use microsoft.extensions.ai #122

Draft: wants to merge 4 commits into base: master
@@ -37,7 +37,7 @@ jobs:
- name: Setup .NET Core
uses: actions/setup-dotnet@v4
with:
dotnet-version: 8.0.x
dotnet-version: 9.0.x

- name: Install DotNet workload - WASM tools
run: |
4 changes: 2 additions & 2 deletions doc/SYSTEMS_C64_AI_CODE_COMPLETION.md
@@ -86,12 +86,12 @@ Configure `CodingAssistant` section in `appsettings.json`.
Using OpenAI:
- `CodingAssistantType:OpenAI:CodingAssistantType`: `OpenAI`
- `CodingAssistantType:OpenAI:ApiKey`: Your own OpenAI API key
- `CodingAssistantType:OpenAI:DeploymentName`: The OpenAI model (default: `gpt-4o`)
- `CodingAssistantType:OpenAI:ModelName`: The OpenAI model (default: `gpt-4o`)

Using self-hosted OpenAI API compatible LLM (Ollama with CodeLlama-code model):
- `CodingAssistantType:OpenAISelfHostedCodeLlama:CodingAssistantType`: `OpenAI`
- `CodingAssistantType:OpenAISelfHostedCodeLlama:Endpoint`: The local Ollama HTTP endpoint (ex: `http://localhost:11434/api`)
- `CodingAssistantType:OpenAISelfHostedCodeLlama:DeploymentName`: A local CodeLlama-code model (ex: `codellama:13b-code` or `codellama:7b-code`)
- `CodingAssistantType:OpenAISelfHostedCodeLlama:ModelName`: A local CodeLlama-code model (ex: `codellama:13b-code` or `codellama:7b-code`)
- `CodingAssistantType:OpenAISelfHostedCodeLlama:ApiKey`: Optional. May be required if Open WebUI proxy is in front of Ollama.
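Putting the keys above together, a minimal `CodingAssistant` section in `appsettings.json` might look like the following sketch (values are placeholders; the API key should normally be stored in dotnet user secrets rather than in the file):

```json
{
  "CodingAssistant": {
    "CodingAssistantType": "OpenAI",
    "OpenAI": {
      "ApiKey": "[SET IN DOTNET USER SECRETS]",
      "ModelName": "gpt-4o"
    },
    "OpenAISelfHostedCodeLlama": {
      "Endpoint": "http://localhost:11434/api",
      "ModelName": "codellama:13b-code"
    }
  }
}
```

Setting `CodingAssistantType` to `OpenAISelfHostedCodeLlama` instead switches to the local Ollama backend.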

Using custom AI backend:
29 changes: 14 additions & 15 deletions src/apps/Highbyte.DotNet6502.App.SadConsole/appsettings.json
@@ -50,29 +50,29 @@
},

"CodingAssistant": {
"CodingAssistantType": "OpenAI", // "None", "OpenAI", "CustomEndpoint"
"CodingAssistantType": "OpenAI", // "None", "OpenAI", "OpenAISelfHostedCodeLlama", "CustomEndpoint"

"OpenAI": {
// Set to true to enable OpenAI Basic coding assistant. Also requires an API key (see below).
"Enabled": false,

// dotnet user-secrets set "CodingAssistant:OpenAI:ApiKey" "[MY API KEY]"
"ApiKey": "[SET IN DOTNET USER SECRETS]",
"ModelName": "gpt-4o" // Works well
//"ModelName": "gpt-3.5-turbo", // Doesn't work
//"ModelName": "gpt-4-turbo" // Works somewhat
//"ModelName": "gpt-4o-mini" // Sometimes works a bit better?
//"ModelName": "chatgpt-4o-latest" // Works well

//"DeploymentName": "gpt-3.5-turbo", // Don't work
//"DeploymentName": "gpt-4-turbo" // Works somewhat
//"DeploymentName": "gpt-4o-mini" // Works a bit better sometimes?
"DeploymentName": "gpt-4o" // Works good
//"DeploymentName": "chatgpt-4o-latest" // Works good

// Required for Azure OpenAI only. If you're using OpenAI, remove the following line.
//"Endpoint": "https://YOUR_ACCOUNT.openai.azure.com/"
},

// TODO: support Azure OpenAI
//"AzureOpenAI": {
// "Endpoint": "https://YOUR_ACCOUNT.openai.azure.com/",
// "ModelName": "gpt-4o"
//},

"OpenAISelfHostedCodeLlama": {
"Endpoint": "http://localhost:11434/api",
//"DeploymentName": "codellama:7b-code", // Works sometimes (must be a CodeLlama:xxx-code model to work).
"DeploymentName": "codellama:13b-code" // Works ok (must be a CodeLlama:xxx-code model to work)
//"ModelName": "codellama:7b-code", // Works sometimes (must be a CodeLlama:xxx-code model to work).
"ModelName": "codellama:13b-code" // Works ok (must be a CodeLlama:xxx-code model to work)
//"ApiKey": "[SET IN DOTNET USER SECRETS]" // API key may not be required for self-hosted
},

@@ -83,6 +83,5 @@
// dotnet user-secrets set "CodingAssistant:CustomEndpoint:ApiKey" "[MY API KEY]"
"ApiKey": "[SET IN DOTNET USER SECRETS]"
}

}
}
@@ -9,11 +9,11 @@
using Highbyte.DotNet6502.Systems.Commodore64.Config;
using Highbyte.DotNet6502.Systems.Commodore64.Models;
using Blazored.LocalStorage;
using Highbyte.DotNet6502.AI.CodingAssistant.Inference.OpenAI;
using Highbyte.DotNet6502.AI.CodingAssistant;
using Highbyte.DotNet6502.Systems.Commodore64.Utils.BasicAssistant;
using static Highbyte.DotNet6502.AI.CodingAssistant.CustomAIEndpointCodeSuggestion;
using System.Text.Json;
using Highbyte.DotNet6502.AI.CodingAssistant.Inference.BackendConfig;

namespace Highbyte.DotNet6502.App.WASM.Emulator.SystemSetup;

@@ -208,12 +208,14 @@ public static async Task<ICodeSuggestion> GetCodeSuggestionImplementation(C64Hos
if (c64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.OpenAI)
{
var openAIApiConfig = await GetOpenAIConfig(localStorageService);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestion(openAIApiConfig, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
var chatClient = ChatClientFactory.CreateOpenAIChatClient(openAIApiConfig);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestion(chatClient, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
}
else if (c64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.OpenAISelfHostedCodeLlama)
{
var openAIApiConfig = await GetOpenAISelfHostedCodeLlamaConfig(localStorageService);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestionForCodeLlama(openAIApiConfig, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
var ollamaConfig = await GetOpenAISelfHostedCodeLlamaConfig(localStorageService);
var chatClient = ChatClientFactory.CreateOllamaChatClient(ollamaConfig);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestionForCodeLlama(chatClient, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
}
else if (c64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.CustomEndpoint)
{
@@ -240,47 +242,52 @@ public static async Task<ICodeSuggestion> GetCodeSuggestionImplementation(C64Hos

}

public static async Task<ApiConfig> GetOpenAIConfig(ILocalStorageService localStorageService)
public static async Task<OpenAIConfig> GetOpenAIConfig(ILocalStorageService localStorageService)
{
var apiKey = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION}:ApiKey");
var apiKey = await localStorageService.GetItemAsStringAsync($"{OpenAIConfig.CONFIG_SECTION}:ApiKey");

var deploymentName = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION}:DeploymentName");
if (string.IsNullOrEmpty(deploymentName))
deploymentName = "gpt-4o"; // Default to a OpenAI model that works well
// Model name is in ModelName (current) or DeploymentName (legacy)
var modelName = await localStorageService.GetItemAsStringAsync($"{OpenAIConfig.CONFIG_SECTION}:ModelName");
if (string.IsNullOrEmpty(modelName))
modelName = await localStorageService.GetItemAsStringAsync($"{OpenAIConfig.CONFIG_SECTION}:DeploymentName");
if (string.IsNullOrEmpty(modelName))
modelName = "gpt-4o"; // Default to an OpenAI model that works well

// For future use: Endpoint can be set if OpenAI is accessed via Azure endpoint.
//var endpoint = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION}:Endpoint");
//Uri.TryCreate(endpoint, UriKind.Absolute, out var endPointUri);

var apiConfig = new ApiConfig()
var apiConfig = new OpenAIConfig()
{
ApiKey = apiKey, // Api key for OpenAI (required), Azure OpenAI (required), or SelfHosted (optional).
DeploymentName = deploymentName, // AI model name
//Endpoint = endPointUri, // Used if using Azure OpenAI
SelfHosted = false,
ModelName = modelName, // AI model name
};
return apiConfig;
}

public static async Task<ApiConfig> GetOpenAISelfHostedCodeLlamaConfig(ILocalStorageService localStorageService)
public static async Task<OllamaConfig> GetOpenAISelfHostedCodeLlamaConfig(ILocalStorageService localStorageService)
{
var apiKey = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:ApiKey");
var apiKey = await localStorageService.GetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:ApiKey");
if (apiKey == string.Empty)
apiKey = null;
var deploymentName = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:DeploymentName");
if (string.IsNullOrEmpty(deploymentName))
deploymentName = "codellama:13b-code"; // Default to a Ollama CodeLlama-code model that seems to work OK (but not as good as OpenAI gpt-4o)
var endpoint = await localStorageService.GetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:Endpoint");

// Model name is in ModelName (current) or DeploymentName (legacy)
var modelName = await localStorageService.GetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:ModelName");
if (string.IsNullOrEmpty(modelName))
modelName = await localStorageService.GetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:DeploymentName");
if (string.IsNullOrEmpty(modelName))
modelName = "codellama:13b-code"; // Default to an Ollama CodeLlama-code model that seems to work OK (but not as well as OpenAI gpt-4o)

var endpoint = await localStorageService.GetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:Endpoint");
if (string.IsNullOrEmpty(endpoint))
endpoint = "http://localhost:11434/api"; // Default to local Ollama
Uri.TryCreate(endpoint, UriKind.Absolute, out var endPointUri);

var apiConfig = new ApiConfig()
var apiConfig = new OllamaConfig()
{
ApiKey = apiKey, // Optional for Self-hosted model.
DeploymentName = deploymentName, // AI CodeLlama-code model name (ex: codellama:13b-code, codellama:7b-code)
ModelName = modelName, // AI CodeLlama-code model name (ex: codellama:13b-code, codellama:7b-code)
Endpoint = endPointUri, // Self-hosted OpenAI API compatible endpoint (for example Ollama)
SelfHosted = true // Set to true to use self-hosted OpenAI API compatible endpoint.
};
return apiConfig;
}
@@ -304,17 +311,16 @@ public static async Task<CustomAIEndpointConfig> GetCustomAIEndpointConfig(ILoca
return apiConfig;
}

public static async Task SaveOpenAICodingAssistantConfigToLocalStorage(ILocalStorageService localStorageService, ApiConfig apiConfig)
public static async Task SaveOpenAICodingAssistantConfigToLocalStorage(ILocalStorageService localStorageService, OpenAIConfig openAIConfig)
{
await localStorageService.SetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION}:ApiKey", apiConfig.ApiKey ?? string.Empty);
//await localStorageService.SetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION}:Endpoint", apiConfig.Endpoint != null ? apiConfig.Endpoint.OriginalString : string.Empty);
await localStorageService.SetItemAsStringAsync($"{OpenAIConfig.CONFIG_SECTION}:ApiKey", openAIConfig.ApiKey ?? string.Empty);
}

public static async Task SaveOpenAISelfHostedCodeLlamaCodingAssistantConfigToLocalStorage(ILocalStorageService localStorageService, ApiConfig apiConfig)
public static async Task SaveOpenAISelfHostedCodeLlamaCodingAssistantConfigToLocalStorage(ILocalStorageService localStorageService, OllamaConfig ollamaConfig)
{
await localStorageService.SetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:ApiKey", apiConfig.ApiKey ?? string.Empty);
await localStorageService.SetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:DeploymentName", apiConfig.DeploymentName ?? string.Empty);
await localStorageService.SetItemAsStringAsync($"{ApiConfig.CONFIG_SECTION_SELF_HOSTED}:Endpoint", apiConfig.Endpoint != null ? apiConfig.Endpoint.OriginalString : string.Empty);
await localStorageService.SetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:ApiKey", ollamaConfig.ApiKey ?? string.Empty);
await localStorageService.SetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:ModelName", ollamaConfig.ModelName ?? string.Empty);
await localStorageService.SetItemAsStringAsync($"{OllamaConfig.CONFIG_SECTION}:Endpoint", ollamaConfig.Endpoint != null ? ollamaConfig.Endpoint.OriginalString : string.Empty);
}

public static async Task SaveCustomCodingAssistantConfigToLocalStorage(ILocalStorageService localStorageService, CustomAIEndpointConfig customAIEndpointConfig)
Expand Up @@ -37,7 +37,7 @@
<PackageReference Include="Microsoft.AspNetCore.WebUtilities" Version="8.0.7" />
<PackageReference Include="SkiaSharp.Views.Blazor" Version="3.0.0-preview.4.1" />
<PackageReference Include="PublishSPAforGitHubPages.Build" Version="2.2.0" />
<PackageReference Include="System.Net.Http.Json" Version="8.0.0" />
<PackageReference Include="System.Net.Http.Json" Version="9.0.0-rc.2.24473.5" />
<PackageReference Include="TextCopy" Version="6.2.1" />
<PackageReference Include="Toolbelt.Blazor.Gamepad" Version="9.0.0" />
</ItemGroup>
@@ -1,6 +1,6 @@
@using Blazored.LocalStorage
@using Highbyte.DotNet6502.AI.CodingAssistant
@using Highbyte.DotNet6502.AI.CodingAssistant.Inference.OpenAI
@using Highbyte.DotNet6502.AI.CodingAssistant.Inference.BackendConfig
@using Highbyte.DotNet6502.App.WASM.Emulator.SystemSetup
@using Highbyte.DotNet6502.Impl.AspNet.Commodore64.Input;
@using Highbyte.DotNet6502.Systems;
@@ -222,7 +222,7 @@
<div class="table-cell table-cell-fixed-width-large twocol">
@if (_openAISelfHostedCodeLlamaAIApiConfig != null)
{
<InputText @ref="_openAISelfHostedCodeLlamaModelNameInputText" @bind-Value="_openAISelfHostedCodeLlamaAIApiConfig.DeploymentName" style="width: inherit" />
<InputText @ref="_openAISelfHostedCodeLlamaModelNameInputText" @bind-Value="_openAISelfHostedCodeLlamaAIApiConfig.ModelName" style="width: inherit" />
}
</div>
</div>
@@ -310,10 +310,10 @@

private void UnloadROMs() => C64HostConfig.SystemConfig.ROMs = new List<ROM>();

private ApiConfig _openAIApiConfig = default!;
private OpenAIConfig _openAIApiConfig = default!;
private InputText _openAIApiKeyInputText = default!;

private ApiConfig _openAISelfHostedCodeLlamaAIApiConfig = default!;
private OllamaConfig _openAISelfHostedCodeLlamaAIApiConfig = default!;
private InputText _openAISelfHostedCodeLlamaEndpointInputText = default!;
private InputText _openAISelfHostedCodeLlamaModelNameInputText = default!;
private InputText _openAISelfHostedCodeLlamaApiKeyInputText = default!;
@@ -433,11 +433,13 @@
ICodeSuggestion codeSuggestion;
if(C64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.OpenAI)
{
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestion(_openAIApiConfig, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
var chatClient = ChatClientFactory.CreateOpenAIChatClient(_openAIApiConfig);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestion(chatClient, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
}
else if(C64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.OpenAISelfHostedCodeLlama)
{
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestionForCodeLlama(_openAISelfHostedCodeLlamaAIApiConfig, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
var chatClient = ChatClientFactory.CreateOllamaChatClient(_openAISelfHostedCodeLlamaAIApiConfig);
codeSuggestion = OpenAICodeSuggestion.CreateOpenAICodeSuggestionForCodeLlama(chatClient, C64BasicCodingAssistant.CODE_COMPLETION_LANGUAGE_DESCRIPTION, C64BasicCodingAssistant.CODE_COMPLETION_ADDITIONAL_SYSTEM_INSTRUCTION);
}
else if(C64HostConfig.CodeSuggestionBackendType == CodeSuggestionBackendTypeEnum.CustomEndpoint)
{
2 changes: 0 additions & 2 deletions src/apps/Highbyte.DotNet6502.App.WASM/Pages/Index.razor.cs
@@ -13,8 +13,6 @@
using Microsoft.AspNetCore.WebUtilities;
using Toolbelt.Blazor.Gamepad;
using Highbyte.DotNet6502.Systems.Logging.Console;
using Microsoft.AspNetCore.Components.Rendering;
using Microsoft.Extensions.Azure;

namespace Highbyte.DotNet6502.App.WASM.Pages;

@@ -0,0 +1,25 @@
using Microsoft.Extensions.Configuration;

namespace Highbyte.DotNet6502.AI.CodingAssistant.Inference.BackendConfig;

public class AzureOpenAIConfig
{
public string ModelName { get; set; }
public Uri Endpoint { get; set; }


public const string CONFIG_SECTION = "CodingAssistant:AzureOpenAI";

public AzureOpenAIConfig()
{
}

Check warning on line 13 in src/libraries/Highbyte.DotNet6502.AI/CodingAssistant/Inference/BackendConfig/AzureOpenAIConfig.cs (GitHub Actions / build, and GitHub Actions / Analyze (csharp)): Non-nullable property 'ModelName' must contain a non-null value when exiting constructor. Consider adding the 'required' modifier or declaring the property as nullable. The same warning is reported for the 'Endpoint' property.

public AzureOpenAIConfig(IConfiguration config)
{
var configSection = config.GetRequiredSection(CONFIG_SECTION);
Endpoint = configSection.GetValue<Uri>("Endpoint")
?? throw new InvalidOperationException($"Missing required configuration value: {CONFIG_SECTION}:Endpoint");
ModelName = configSection.GetValue<string>("ModelName")
?? throw new InvalidOperationException($"Missing required configuration value: {CONFIG_SECTION}:ModelName");
}
}
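The configuration-bound constructor above requires `Endpoint` and `ModelName` values under the `CodingAssistant:AzureOpenAI` section; a matching appsettings fragment could look like the following sketch (the account name is a placeholder, mirroring the commented-out TODO block in appsettings.json):

```json
{
  "CodingAssistant": {
    "AzureOpenAI": {
      "Endpoint": "https://YOUR_ACCOUNT.openai.azure.com/",
      "ModelName": "gpt-4o"
    }
  }
}
```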