chore(internal): codegen related update (#32)
stainless-app[bot] authored and stainless-bot committed Aug 12, 2024
1 parent 52c8005 commit edd1691
Showing 18 changed files with 96 additions and 100 deletions.
2 changes: 2 additions & 0 deletions .github/workflows/release-doctor.yml
@@ -1,6 +1,8 @@
name: Release Doctor
on:
pull_request:
+branches:
+- main
workflow_dispatch:

jobs:
32 changes: 16 additions & 16 deletions README.md
@@ -22,12 +22,12 @@ The full API of this library can be found in [api.md](api.md).
```js
import Together from 'together-ai';

-const together = new Together({
+const client = new Together({
apiKey: process.env['TOGETHER_API_KEY'], // This is the default and can be omitted
});

async function main() {
-  const chatCompletion = await together.chat.completions.create({
+  const chatCompletion = await client.chat.completions.create({
messages: [{ role: 'user', content: 'Say this is a test!' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
});
@@ -45,9 +45,9 @@ We provide support for streaming responses using Server Sent Events (SSE).
```ts
import Together from 'together-ai';

-const together = new Together();
+const client = new Together();

-const stream = await together.chat.completions.create({
+const stream = await client.chat.completions.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
stream: true,
@@ -68,7 +68,7 @@ This library includes TypeScript definitions for all request params and response
```ts
import Together from 'together-ai';

-const together = new Together({
+const client = new Together({
apiKey: process.env['TOGETHER_API_KEY'], // This is the default and can be omitted
});

@@ -77,7 +77,7 @@ async function main() {
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
};
-  const chatCompletion: Together.Chat.ChatCompletion = await together.chat.completions.create(params);
+  const chatCompletion: Together.Chat.ChatCompletion = await client.chat.completions.create(params);
}

main();
@@ -94,7 +94,7 @@ a subclass of `APIError` will be thrown:
<!-- prettier-ignore -->
```ts
async function main() {
-  const chatCompletion = await together.chat.completions
+  const chatCompletion = await client.chat.completions
.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
@@ -137,12 +137,12 @@ You can use the `maxRetries` option to configure or disable this:
<!-- prettier-ignore -->
```js
// Configure the default for all requests:
-const together = new Together({
+const client = new Together({
maxRetries: 0, // default is 2
});

// Or, configure per-request:
-await together.chat.completions.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'mistralai/Mixtral-8x7B-Instruct-v0.1' }, {
+await client.chat.completions.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'mistralai/Mixtral-8x7B-Instruct-v0.1' }, {
maxRetries: 5,
});
```
@@ -154,12 +154,12 @@ Requests time out after 1 minute by default. You can configure this with a `timeout` option:
<!-- prettier-ignore -->
```ts
// Configure the default for all requests:
-const together = new Together({
+const client = new Together({
timeout: 20 * 1000, // 20 seconds (default is 1 minute)
});

// Override per-request:
-await together.chat.completions.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'mistralai/Mixtral-8x7B-Instruct-v0.1' }, {
+await client.chat.completions.create({ messages: [{ role: 'user', content: 'Say this is a test' }], model: 'mistralai/Mixtral-8x7B-Instruct-v0.1' }, {
timeout: 5 * 1000,
});
```
@@ -178,9 +178,9 @@ You can also use the `.withResponse()` method to get the raw `Response` along with the parsed data.

<!-- prettier-ignore -->
```ts
-const together = new Together();
+const client = new Together();

-const response = await together.chat.completions
+const response = await client.chat.completions
.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
@@ -189,7 +189,7 @@ const response = await together.chat.completions
console.log(response.headers.get('X-My-Header'));
console.log(response.statusText); // access the underlying Response object

-const { data: chatCompletion, response: raw } = await together.chat.completions
+const { data: chatCompletion, response: raw } = await client.chat.completions
.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
@@ -295,12 +295,12 @@ import http from 'http';
import { HttpsProxyAgent } from 'https-proxy-agent';

// Configure the default for all requests:
-const together = new Together({
+const client = new Together({
httpAgent: new HttpsProxyAgent(process.env.PROXY_URL),
});

// Override per-request:
-await together.chat.completions.create(
+await client.chat.completions.create(
{
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
13 changes: 1 addition & 12 deletions bin/check-release-environment
@@ -1,20 +1,9 @@
#!/usr/bin/env bash

-warnings=()
errors=()

if [ -z "${NPM_TOKEN}" ]; then
-warnings+=("The TOGETHER_NPM_TOKEN secret has not been set. Please set it in either this repository's secrets or your organization secrets")
-fi
-
-lenWarnings=${#warnings[@]}
-
-if [[ lenWarnings -gt 0 ]]; then
-echo -e "Found the following warnings in the release environment:\n"
-
-for warning in "${warnings[@]}"; do
-echo -e "- $warning\n"
-done
+errors+=("The TOGETHER_NPM_TOKEN secret has not been set. Please set it in either this repository's secrets or your organization secrets")
fi

lenErrors=${#errors[@]}
20 changes: 18 additions & 2 deletions bin/publish-npm
@@ -2,8 +2,24 @@

set -eux

-npm config set //registry.npmjs.org/:_authToken $NPM_TOKEN
+npm config set '//registry.npmjs.org/:_authToken' "$NPM_TOKEN"

+# Build the project
yarn build

+# Navigate to the dist directory
cd dist
-yarn publish --access public

+# Get the version from package.json
+VERSION="$(node -p "require('./package.json').version")"

+# Extract the pre-release tag if it exists
+if [[ "$VERSION" =~ -([a-zA-Z]+) ]]; then
+# Extract the part before any dot in the pre-release identifier
+TAG="${BASH_REMATCH[1]}"
+else
+TAG="latest"
+fi

+# Publish with the appropriate tag
+yarn publish --access public --tag "$TAG"
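As a sanity check on the dist-tag logic above, here is a small TypeScript sketch (the `tagFor` helper is hypothetical, not part of the repo) that mirrors the bash `=~ -([a-zA-Z]+)` match: a pre-release version publishes under its pre-release identifier, anything else under `latest`.

```typescript
// Hypothetical helper mirroring the bash logic in bin/publish-npm:
// pick an npm dist-tag from a semver string.
function tagFor(version: string): string {
  // First run of letters after a hyphen, e.g. "alpha" in "1.2.3-alpha.1".
  const match = /-([a-zA-Z]+)/.exec(version);
  return match ? match[1] : 'latest';
}

console.log(tagFor('1.2.3-alpha.1')); // "alpha"
console.log(tagFor('4.0.0'));         // "latest"
```

This keeps pre-releases like `1.2.3-alpha.1` from being installed by a plain `npm install`, which resolves the `latest` tag.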
3 changes: 1 addition & 2 deletions package.json
@@ -30,8 +30,7 @@
"agentkeepalive": "^4.2.1",
"form-data-encoder": "1.7.2",
"formdata-node": "^4.3.2",
-"node-fetch": "^2.6.7",
-"web-streams-polyfill": "^3.2.1"
+"node-fetch": "^2.6.7"
},
"devDependencies": {
"@swc/core": "^1.3.102",
4 changes: 2 additions & 2 deletions scripts/mock
@@ -21,7 +21,7 @@ echo "==> Starting mock server with URL ${URL}"

# Run prism mock on the given spec
if [ "$1" == "--daemon" ]; then
-npm exec --package=@stoplight/prism-cli@~5.8 -- prism mock "$URL" &> .prism.log &
+npm exec --package=@stainless-api/prism-cli@5.8.4 -- prism mock "$URL" &> .prism.log &

# Wait for server to come online
echo -n "Waiting for server"
@@ -37,5 +37,5 @@ if [ "$1" == "--daemon" ]; then

echo
else
-npm exec --package=@stoplight/prism-cli@~5.8 -- prism mock "$URL"
+npm exec --package=@stainless-api/prism-cli@5.8.4 -- prism mock "$URL"
fi
4 changes: 1 addition & 3 deletions src/_shims/node-runtime.ts
@@ -13,9 +13,7 @@ import { Readable } from 'node:stream';
import { type RequestOptions } from '../core';
import { MultipartBody } from './MultipartBody';
import { type Shims } from './registry';

-// @ts-ignore (this package does not have proper export maps for this export)
-import { ReadableStream } from 'web-streams-polyfill/dist/ponyfill.es2018.js';
+import { ReadableStream } from 'node:stream/web';

type FileFromPathOptions = Omit<FilePropertyBag, 'lastModified'>;
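The swap above works because `ReadableStream` has shipped in the built-in `node:stream/web` module since Node 16.5, so the polyfill dependency can go. A minimal sketch (illustrative only, not the SDK's code) of constructing and draining one:

```typescript
// ReadableStream is built into Node via 'node:stream/web' (Node >= 16.5),
// so no web-streams-polyfill is needed.
import { ReadableStream } from 'node:stream/web';

// Drain a stream into an array of chunks.
async function readAll(rs: ReadableStream<string>): Promise<string[]> {
  const chunks: string[] = [];
  const reader = rs.getReader();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    chunks.push(value!);
  }
  return chunks;
}

const stream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue('hello');
    controller.close();
  },
});

readAll(stream).then((chunks) => console.log(chunks)); // [ 'hello' ]
```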

1 change: 1 addition & 0 deletions src/index.ts
@@ -143,6 +143,7 @@ export class Together extends Core.APIClient {
}

static Together = this;
+static DEFAULT_TIMEOUT = 60000; // 1 minute

static TogetherError = Errors.TogetherError;
static APIError = Errors.APIError;
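The new `DEFAULT_TIMEOUT` static is the fallback when neither the client nor the request sets one. A tiny sketch of that resolution order (illustrative only; `resolveTimeout` is a hypothetical helper, not the SDK's internals):

```typescript
// Fallback chain for a request timeout, most specific first:
// per-request override, then client-level option, then the static default.
const DEFAULT_TIMEOUT = 60000; // 1 minute, matching Together.DEFAULT_TIMEOUT

function resolveTimeout(clientTimeout?: number, requestTimeout?: number): number {
  return requestTimeout ?? clientTimeout ?? DEFAULT_TIMEOUT;
}

console.log(resolveTimeout());            // 60000
console.log(resolveTimeout(20000));       // 20000
console.log(resolveTimeout(20000, 5000)); // 5000
```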
8 changes: 4 additions & 4 deletions src/resources/chat/completions.ts
@@ -114,20 +114,20 @@ export namespace ChatCompletionChunk {

index: number;

-logprobs?: CompletionsAPI.LogProbs;
+logprobs?: number | null;
}

export namespace Choice {
export interface Delta {
-role: 'system' | 'user' | 'assistant' | 'function' | 'tool';

content?: string | null;

/**
* @deprecated
*/
function_call?: Delta.FunctionCall | null;

+role?: 'system' | 'user' | 'assistant' | 'function' | 'tool';

token_id?: number;

tool_calls?: Array<CompletionsAPI.ToolChoice>;
@@ -282,7 +282,7 @@ export interface CompletionCreateParamsBase {
max_tokens?: number;

/**
-* A number between 0 and 1 that can be used as an alternative to temperature.
+* A number between 0 and 1 that can be used as an alternative to top_p and top-k.
*/
min_p?: number;

2 changes: 1 addition & 1 deletion src/resources/completions.ts
@@ -163,7 +163,7 @@ export interface CompletionCreateParamsBase {
max_tokens?: number;

/**
-* A number between 0 and 1 that can be used as an alternative to temperature.
+* A number between 0 and 1 that can be used as an alternative to top-p and top-k.
*/
min_p?: number;

18 changes: 9 additions & 9 deletions tests/api-resources/chat/completions.test.ts
@@ -3,18 +3,18 @@
import Together from 'together-ai';
import { Response } from 'node-fetch';

-const together = new Together({
+const client = new Together({
apiKey: 'My API Key',
baseURL: process.env['TEST_API_BASE_URL'] ?? 'http://127.0.0.1:4010',
});

describe('resource completions', () => {
test('create: only required params', async () => {
-    const responsePromise = together.chat.completions.create({
+    const responsePromise = client.chat.completions.create({
messages: [
-{ role: 'system', content: 'string' },
-{ role: 'system', content: 'string' },
-{ role: 'system', content: 'string' },
+{ role: 'system', content: 'content' },
+{ role: 'system', content: 'content' },
+{ role: 'system', content: 'content' },
],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
});
@@ -28,11 +28,11 @@ describe('resource completions', () => {
});

test('create: required and optional params', async () => {
-    const response = await together.chat.completions.create({
+    const response = await client.chat.completions.create({
messages: [
-{ role: 'system', content: 'string' },
-{ role: 'system', content: 'string' },
-{ role: 'system', content: 'string' },
+{ role: 'system', content: 'content' },
+{ role: 'system', content: 'content' },
+{ role: 'system', content: 'content' },
],
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
echo: true,
6 changes: 3 additions & 3 deletions tests/api-resources/completions.test.ts
@@ -3,14 +3,14 @@
import Together from 'together-ai';
import { Response } from 'node-fetch';

-const together = new Together({
+const client = new Together({
apiKey: 'My API Key',
baseURL: process.env['TEST_API_BASE_URL'] ?? 'http://127.0.0.1:4010',
});

describe('resource completions', () => {
test('create: only required params', async () => {
-    const responsePromise = together.completions.create({
+    const responsePromise = client.completions.create({
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
prompt: '<s>[INST] What is the capital of France? [/INST]',
});
@@ -24,7 +24,7 @@ });
});

test('create: required and optional params', async () => {
-    const response = await together.completions.create({
+    const response = await client.completions.create({
model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
prompt: '<s>[INST] What is the capital of France? [/INST]',
echo: true,
6 changes: 3 additions & 3 deletions tests/api-resources/embeddings.test.ts
@@ -3,14 +3,14 @@
import Together from 'together-ai';
import { Response } from 'node-fetch';

-const together = new Together({
+const client = new Together({
apiKey: 'My API Key',
baseURL: process.env['TEST_API_BASE_URL'] ?? 'http://127.0.0.1:4010',
});

describe('resource embeddings', () => {
test('create: only required params', async () => {
-    const responsePromise = together.embeddings.create({
+    const responsePromise = client.embeddings.create({
input: 'Our solar system orbits the Milky Way galaxy at about 515,000 mph',
model: 'togethercomputer/m2-bert-80M-8k-retrieval',
});
@@ -24,7 +24,7 @@ });
});

test('create: required and optional params', async () => {
-    const response = await together.embeddings.create({
+    const response = await client.embeddings.create({
input: 'Our solar system orbits the Milky Way galaxy at about 515,000 mph',
model: 'togethercomputer/m2-bert-80M-8k-retrieval',
});
