
Add Support AWS Bedrock #7

Merged — 9 commits, Mar 18, 2024
16 changes: 15 additions & 1 deletion Claudia.sln
@@ -19,7 +19,11 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "BlazorApp1", "sandbox\Blazo
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Claudia.FunctionGenerator", "src\Claudia.FunctionGenerator\Claudia.FunctionGenerator.csproj", "{8C464111-AD67-4D2B-9AE2-0B52AB077EBD}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Claudia.FunctionGenerator.Tests", "tests\Claudia.FunctionGenerator.Tests\Claudia.FunctionGenerator.Tests.csproj", "{89A58A08-F553-4CEA-A2A8-783009501E05}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Claudia.FunctionGenerator.Tests", "tests\Claudia.FunctionGenerator.Tests\Claudia.FunctionGenerator.Tests.csproj", "{89A58A08-F553-4CEA-A2A8-783009501E05}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "BedrockConsoleApp", "sandbox\BedrockConsoleApp\BedrockConsoleApp.csproj", "{79C84272-E0AB-4918-9454-B0AEA9CBE40A}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Claudia.Bedrock", "src\Claudia.Bedrock\Claudia.Bedrock.csproj", "{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
@@ -51,6 +55,14 @@ Global
{89A58A08-F553-4CEA-A2A8-783009501E05}.Debug|Any CPU.Build.0 = Debug|Any CPU
{89A58A08-F553-4CEA-A2A8-783009501E05}.Release|Any CPU.ActiveCfg = Release|Any CPU
{89A58A08-F553-4CEA-A2A8-783009501E05}.Release|Any CPU.Build.0 = Release|Any CPU
{79C84272-E0AB-4918-9454-B0AEA9CBE40A}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{79C84272-E0AB-4918-9454-B0AEA9CBE40A}.Debug|Any CPU.Build.0 = Debug|Any CPU
{79C84272-E0AB-4918-9454-B0AEA9CBE40A}.Release|Any CPU.ActiveCfg = Release|Any CPU
{79C84272-E0AB-4918-9454-B0AEA9CBE40A}.Release|Any CPU.Build.0 = Release|Any CPU
{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA}.Debug|Any CPU.Build.0 = Debug|Any CPU
{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA}.Release|Any CPU.ActiveCfg = Release|Any CPU
{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@@ -62,6 +74,8 @@ Global
{8EEB0F69-132B-4887-959D-25531588FCD2} = {E61BFC87-2B96-4699-9B69-EE4B008AE0A0}
{8C464111-AD67-4D2B-9AE2-0B52AB077EBD} = {B54A8855-F8F0-4015-80AA-86974E65AC2D}
{89A58A08-F553-4CEA-A2A8-783009501E05} = {1B4BD6F6-8528-4409-BA55-085DA5486D36}
{79C84272-E0AB-4918-9454-B0AEA9CBE40A} = {E61BFC87-2B96-4699-9B69-EE4B008AE0A0}
{9EC270A6-6E6F-44CF-8A4C-975A2A7344AA} = {B54A8855-F8F0-4015-80AA-86974E65AC2D}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {B7CEBA02-BB0C-4102-AE58-DFD114C3192A}
63 changes: 63 additions & 0 deletions README.md
@@ -14,6 +14,8 @@ This library is distributed via NuGet, supporting .NET Standard 2.1, .NET 6(.NET

It can also be used with Unity Game Engine both Runtime and Editor. For instructions on how to use it, please refer to the [Unity section](#unity).

You can also use it with AWS Bedrock. Check the [AWS Bedrock section](#aws-bedrock) for more details.

Usage
---
For details about the API, please check the [official API reference](https://docs.anthropic.com/claude/reference/getting-started-with-the-api).
@@ -829,6 +831,67 @@ public partial class Home

If you need to store the chat message history, you can serialize `List<Message> chatMessages` to JSON and save it to a file or database.

AWS Bedrock
---
We provide support for the [Anthropic Bedrock API](https://aws.amazon.com/bedrock/claude/) through a separate package.

> PM> Install-Package [Claudia.Bedrock](https://www.nuget.org/packages/Claudia.Bedrock)

Create an `AmazonBedrockRuntimeClient` from the AWS SDK and call `UseAnthropic` with the Bedrock model ID to obtain an Anthropic client. Set the `Model` property of the request message to the `anthropic_version` value; everything else works the same as the regular Anthropic client.

```csharp
// configure your own AWS credentials (profile name omitted here)
AWSConfigs.AWSProfileName = "";

var bedrock = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);
var anthropic = bedrock.UseAnthropic("anthropic.claude-3-haiku-20240307-v1:0"); // Model Id

var response = await anthropic.Messages.CreateAsync(new()
{
Model = "bedrock-2023-05-31", // anthropic_version
MaxTokens = 1024,
Messages = [new() { Role = "user", Content = "Hello, Claude" }]
});

Console.WriteLine(response);
```

Streaming Messages work in the same way.

```csharp
var stream = anthropic.Messages.CreateStreamAsync(new()
{
Model = "bedrock-2023-05-31", // anthropic_version
MaxTokens = 1024,
Messages = [new() { Role = "user", Content = "Hello, Claude" }]
});

await foreach (var item in stream)
{
Console.WriteLine(item);
}
```

If you need the raw response, call `InvokeModelAsync` or `InvokeModelWithResponseStreamAsync` instead. This allows you to check the status code and headers before retrieving the result with `GetMessageResponse` or `GetMessageResponseAsync`.

```csharp
var bedrock = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// (string modelId, MessageRequest request)
var response = await bedrock.InvokeModelAsync("anthropic.claude-3-haiku-20240307-v1:0", new()
{
Model = "bedrock-2023-05-31", // anthropic_version
MaxTokens = 1024,
Messages = [new() { Role = "user", Content = "Hello, Claude" }]
});

Console.WriteLine(response.ResponseMetadata.RequestId);

var responseMessage = response.GetMessageResponse();

Console.WriteLine(responseMessage);
```
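Building on the snippet above, you can guard on the HTTP status before parsing the body. This is a sketch, not part of the library's documented surface; `HttpStatusCode` is the standard property exposed by AWS SDK response objects:

```csharp
using System.Net;
using Amazon;
using Amazon.BedrockRuntime;
using Claudia;

var bedrock = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

var response = await bedrock.InvokeModelAsync("anthropic.claude-3-haiku-20240307-v1:0", new()
{
    Model = "bedrock-2023-05-31", // anthropic_version
    MaxTokens = 1024,
    Messages = [new() { Role = "user", Content = "Hello, Claude" }]
});

// AWS SDK responses expose the HTTP status of the underlying call,
// so transport-level failures can be surfaced before deserialization.
if (response.HttpStatusCode != HttpStatusCode.OK)
{
    throw new InvalidOperationException($"Bedrock returned {response.HttpStatusCode}");
}

Console.WriteLine(response.GetMessageResponse());
```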

Unity
---
Minimum supported Unity version is `2022.3.12f1`. You need to install from NuGet. We recommend using [NuGetForUnity](https://github.com/GlitchEnzo/NuGetForUnity).
20 changes: 20 additions & 0 deletions sandbox/BedrockConsoleApp/BedrockConsoleApp.csproj
@@ -0,0 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">

<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net8.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>

<ItemGroup>
<PackageReference Include="AWSSDK.BedrockRuntime" Version="3.7.301.47" />
<PackageReference Include="AWSSDK.SecurityToken" Version="3.7.300.59" />
</ItemGroup>

<ItemGroup>
<ProjectReference Include="..\..\src\Claudia.Bedrock\Claudia.Bedrock.csproj" />
<ProjectReference Include="..\..\src\Claudia\Claudia.csproj" />
</ItemGroup>

</Project>
63 changes: 63 additions & 0 deletions sandbox/BedrockConsoleApp/Program.cs
@@ -0,0 +1,63 @@
using Amazon;
using Amazon.BedrockRuntime;
using Claudia;


// configure your own AWS credentials (profile name omitted here)
AWSConfigs.AWSProfileName = "";

//var bedrock = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);
//var anthropic = bedrock.UseAnthropic("anthropic.claude-3-haiku-20240307-v1:0");

//var response = await anthropic.Messages.CreateAsync(new()
//{
// Model = "bedrock-2023-05-31",
// MaxTokens = 1024,
// Messages = [new() { Role = "user", Content = "Hello, Claude" }]
//});

//Console.WriteLine(response);


//var stream = anthropic.Messages.CreateStreamAsync(new()
//{
// Model = "bedrock-2023-05-31",
// MaxTokens = 1024,
// Messages = [new() { Role = "user", Content = "Hello, Claude" }]
//});

//await foreach (var item in stream)
//{
// Console.WriteLine(item);
//}


var bedrock = new AmazonBedrockRuntimeClient(RegionEndpoint.USEast1);

// (string modelId, MessageRequest request)
var response = await bedrock.InvokeModelAsync("anthropic.claude-3-haiku-20240307-v1:0", new()
{
Model = "bedrock-2023-05-31", // anthropic_version
MaxTokens = 1024,
Messages = [new() { Role = "user", Content = "Hello, Claude" }]
});

Console.WriteLine(response.ResponseMetadata.RequestId);

var responseMessage = response.GetMessageResponse();

Console.WriteLine(responseMessage);
110 changes: 110 additions & 0 deletions src/Claudia.Bedrock/BedrockAnthropicJsonSerialzierContext.cs
@@ -0,0 +1,110 @@
using System.Text.Json;
using System.Text.Json.Serialization;

namespace Claudia;

internal static class BedrockAnthropicJsonSerialzierContext
{
public static JsonSerializerOptions Options { get; }

static BedrockAnthropicJsonSerialzierContext()
{
var options = new JsonSerializerOptions(InternalBedrockAnthropicJsonSerialzierContext.Default.Options);
options.TypeInfoResolverChain.Add(AnthropicJsonSerialzierContext.Default.Options.TypeInfoResolver!);
options.MakeReadOnly();

Options = options;
}
}

[JsonSourceGenerationOptions(
GenerationMode = JsonSourceGenerationMode.Default,
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
WriteIndented = false)]
[JsonSerializable(typeof(BedrockMessageRequest))]
internal partial class InternalBedrockAnthropicJsonSerialzierContext : JsonSerializerContext
{
}

// "model" -> "anthropic_version"
internal record class BedrockMessageRequest
{
/// <summary>
/// The model that will complete your prompt.
/// </summary>
// [JsonPropertyName("model")]
[JsonPropertyName("anthropic_version")]
public required string Model { get; set; }

/// <summary>
/// The maximum number of tokens to generate before stopping.
/// Note that our models may stop before reaching this maximum. This parameter only specifies the absolute maximum number of tokens to generate.
/// Different models have different maximum values for this parameter.
/// </summary>
[JsonPropertyName("max_tokens")]
public required int MaxTokens { get; set; }

/// <summary>
/// Input messages.
/// </summary>
[JsonPropertyName("messages")]
public required Message[] Messages { get; set; }

// optional parameters

/// <summary>
/// System prompt.
/// A system prompt is a way of providing context and instructions to Claude, such as specifying a particular goal or role.
/// </summary>
[JsonPropertyName("system")]
public string? System { get; set; }

/// <summary>
/// An object describing metadata about the request.
/// </summary>
[JsonPropertyName("metadata")]
public Metadata? Metadata { get; set; }

/// <summary>
/// Custom text sequences that will cause the model to stop generating.
/// Our models will normally stop when they have naturally completed their turn, which will result in a response stop_reason of "end_turn".
/// If you want the model to stop generating when it encounters custom strings of text, you can use the stop_sequences parameter. If the model encounters one of the custom sequences, the response stop_reason value will be "stop_sequence" and the response stop_sequence value will contain the matched stop sequence.
/// </summary>
[JsonPropertyName("stop_sequences")]
public string[]? StopSequences { get; set; }

/// <summary>
/// Whether to incrementally stream the response using server-sent events.
/// </summary>
[JsonPropertyName("stream")]
[JsonInclude] // internal so requires Include.
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingDefault)]
internal bool? Stream { get; set; }

/// <summary>
/// Amount of randomness injected into the response.
/// Defaults to 1.0. Ranges from 0.0 to 1.0. Use temperature closer to 0.0 for analytical / multiple choice, and closer to 1.0 for creative and generative tasks.
/// Note that even with temperature of 0.0, the results will not be fully deterministic.
/// </summary>
[JsonPropertyName("temperature")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingDefault)]
public double? Temperature { get; set; }

/// <summary>
/// Use nucleus sampling.
/// In nucleus sampling, we compute the cumulative distribution over all the options for each subsequent token in decreasing probability order and cut it off once it reaches a particular probability specified by top_p. You should either alter temperature or top_p, but not both.
/// Recommended for advanced use cases only. You usually only need to use temperature.
/// </summary>
[JsonPropertyName("top_p")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingDefault)]
public double? TopP { get; set; }

/// <summary>
/// Only sample from the top K options for each subsequent token.
/// Used to remove "long tail" low probability responses.
/// Recommended for advanced use cases only. You usually only need to use temperature.
/// </summary>
[JsonPropertyName("top_k")]
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingDefault)]
public double? TopK { get; set; }
}
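
To illustrate the `"model"` → `"anthropic_version"` rename this record exists for: a request built as `new() { Model = "bedrock-2023-05-31", MaxTokens = 1024, Messages = [...] }` serializes to a Bedrock payload roughly like the following (a sketch of the expected shape, not captured output; the exact `content` encoding depends on Claudia's `Message` serialization):

```json
{
  "anthropic_version": "bedrock-2023-05-31",
  "max_tokens": 1024,
  "messages": [
    { "role": "user", "content": "Hello, Claude" }
  ]
}
```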