🔥V3 dev publish @huggingface/transformers #852

Merged: 28 commits, Aug 8, 2024

Commits

- 31101c8 `@xenova/transformers` -> `@huggingface/transformers` (xenova, Jul 17, 2024)
- e84322b Override semver version (xenova, Jul 17, 2024)
- 6d3ea4b fix bundler config for latest ORT (fs-eire, Jul 25, 2024)
- 487d8b2 Merge branch 'v3' into @huggingface/transformers (xenova, Aug 7, 2024)
- 1f6e0e1 Add prettier (xenova, Aug 7, 2024)
- 55494d1 prettier format config files (xenova, Aug 7, 2024)
- 5a68461 remove incorrect comment (xenova, Aug 7, 2024)
- 437cb34 Merge branch 'pr/864' into @huggingface/transformers (xenova, Aug 7, 2024)
- 5a6c926 Update onnxruntime-web version (xenova, Aug 7, 2024)
- b19251b Update webpack.config.js (xenova, Aug 7, 2024)
- 820c1e2 Fix copy path (xenova, Aug 7, 2024)
- b0dab91 Run `npm ci` (xenova, Aug 7, 2024)
- 86b9b62 Fix bundling (xenova, Aug 7, 2024)
- b326cc9 Merge branch 'v3' into @huggingface/transformers (xenova, Aug 7, 2024)
- ca67092 Update `@webgpu/types` (xenova, Aug 7, 2024)
- 42076fd Update SAM example (xenova, Aug 7, 2024)
- 48d3142 Use `??=` operator where possible (xenova, Aug 7, 2024)
- 3b1a4fd Fix commonjs usage (xenova, Aug 8, 2024)
- 9a73b5e Mark `onnxruntime-node` and `sharp` as externals (xenova, Aug 8, 2024)
- 9951aa5 Move `externals` into config (xenova, Aug 8, 2024)
- c04d37e Downgrade to onnxruntime 1.18.0 (xenova, Aug 8, 2024)
- d32fe2b Finalize module/commonjs build (xenova, Aug 8, 2024)
- 1530d50 Separate web and node builds (xenova, Aug 8, 2024)
- b4df0e2 [version] Update to 3.0.0-alpha.1 (xenova, Aug 8, 2024)
- ab59c51 Default to CDN-hosted .wasm files (xenova, Aug 8, 2024)
- 866b219 [version] Update to 3.0.0-alpha.2 (xenova, Aug 8, 2024)
- 4a3398d bump versions (xenova, Aug 8, 2024)
- 8891a14 [version] Update to 3.0.0-alpha.3 (xenova, Aug 8, 2024)
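The unifying change across the diff below is the package rename: every `@xenova/transformers` import becomes `@huggingface/transformers`, applied across the README, docs, and examples. A minimal before/after sketch, using the same `pipeline` call as the quick-tour snippet in this PR:

```javascript
// Before (v2.x, published as @xenova/transformers):
// import { pipeline } from '@xenova/transformers';

// After (v3 alpha, published as @huggingface/transformers):
import { pipeline } from '@huggingface/transformers';

// Allocate a pipeline for sentiment-analysis (the API itself is unchanged).
let pipe = await pipeline('sentiment-analysis');
let out = await pipe('I love transformers!');
```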
Files changed
8 changes: 8 additions & 0 deletions .prettierignore
@@ -0,0 +1,8 @@
+# Ignore artifacts:
+.github
+dist
+docs
+examples
+scripts
+types
+*.md
1 change: 1 addition & 0 deletions .prettierrc
@@ -0,0 +1 @@
+{}
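An empty `.prettierrc` (`{}`) opts into Prettier's default formatting options, while the `.prettierignore` above keeps generated output (`dist`, `types`, `docs`) and other non-source directories out of formatting.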
24 changes: 12 additions & 12 deletions README.md
@@ -11,14 +11,14 @@
 </p>

 <p align="center">
-<a href="https://www.npmjs.com/package/@xenova/transformers">
-<img alt="NPM" src="https://img.shields.io/npm/v/@xenova/transformers">
+<a href="https://www.npmjs.com/package/@huggingface/transformers">
+<img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers">
 </a>
-<a href="https://www.npmjs.com/package/@xenova/transformers">
-<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@xenova/transformers">
+<a href="https://www.npmjs.com/package/@huggingface/transformers">
+<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers">
 </a>
-<a href="https://www.jsdelivr.com/package/npm/@xenova/transformers">
-<img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@xenova/transformers">
+<a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers">
+<img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers">
 </a>
 <a href="https://github.com/xenova/transformers.js/blob/main/LICENSE">
 <img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue">
@@ -69,7 +69,7 @@ out = pipe('I love transformers!')
 <td>

 ```javascript
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';

 // Allocate a pipeline for sentiment-analysis
 let pipe = await pipeline('sentiment-analysis');
@@ -93,15 +93,15 @@ let pipe = await pipeline('sentiment-analysis', 'Xenova/bert-base-multilingual-u
 ## Installation


-To install via [NPM](https://www.npmjs.com/package/@xenova/transformers), run:
+To install via [NPM](https://www.npmjs.com/package/@huggingface/transformers), run:
 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.0';
+import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.3';
 </script>
 ```

@@ -134,12 +134,12 @@ Check out the Transformers.js [template](https://huggingface.co/new-space?templa



-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.0/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.3/dist/), which should work out-of-the-box. You can customize this as follows:

 ### Settings

 ```javascript
-import { env } from '@xenova/transformers';
+import { env } from '@huggingface/transformers';

 // Specify a custom location for models (defaults to '/models/').
 env.localModelPath = '/path/to/models/';
12 changes: 6 additions & 6 deletions docs/scripts/build_readme.py
@@ -13,14 +13,14 @@
 </p>

 <p align="center">
-<a href="https://www.npmjs.com/package/@xenova/transformers">
-<img alt="NPM" src="https://img.shields.io/npm/v/@xenova/transformers">
+<a href="https://www.npmjs.com/package/@huggingface/transformers">
+<img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers">
 </a>
-<a href="https://www.npmjs.com/package/@xenova/transformers">
-<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@xenova/transformers">
+<a href="https://www.npmjs.com/package/@huggingface/transformers">
+<img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers">
 </a>
-<a href="https://www.jsdelivr.com/package/npm/@xenova/transformers">
-<img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@xenova/transformers">
+<a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers">
+<img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers">
 </a>
 <a href="https://github.com/xenova/transformers.js/blob/main/LICENSE">
 <img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue">
2 changes: 1 addition & 1 deletion docs/snippets/1_quick-tour.snippet
@@ -23,7 +23,7 @@ out = pipe('I love transformers!')
 <td>

 ```javascript
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';

 // Allocate a pipeline for sentiment-analysis
 let pipe = await pipeline('sentiment-analysis');
6 changes: 3 additions & 3 deletions docs/snippets/2_installation.snippet
@@ -1,12 +1,12 @@

-To install via [NPM](https://www.npmjs.com/package/@xenova/transformers), run:
+To install via [NPM](https://www.npmjs.com/package/@huggingface/transformers), run:
 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.0';
+import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.3';
 </script>
 ```
4 changes: 2 additions & 2 deletions docs/snippets/4_custom-usage.snippet
@@ -1,11 +1,11 @@


-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.0/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.3/dist/), which should work out-of-the-box. You can customize this as follows:

 ### Settings

 ```javascript
-import { env } from '@xenova/transformers';
+import { env } from '@huggingface/transformers';

 // Specify a custom location for models (defaults to '/models/').
 env.localModelPath = '/path/to/models/';
6 changes: 3 additions & 3 deletions docs/source/guides/node-audio-processing.md
@@ -26,11 +26,11 @@ This tutorial will be written as an ES module, but you can easily adapt it to us

 ## Getting started

-Let's start by creating a new Node.js project and installing Transformers.js via [NPM](https://www.npmjs.com/package/@xenova/transformers):
+Let's start by creating a new Node.js project and installing Transformers.js via [NPM](https://www.npmjs.com/package/@huggingface/transformers):

 ```bash
 npm init -y
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 <Tip>
@@ -52,7 +52,7 @@ npm i wavefile
 Start by creating a new file called `index.js`, which will be the entry point for our application. Let's also import the necessary modules:

 ```js
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';
 import wavefile from 'wavefile';
 ```
2 changes: 1 addition & 1 deletion docs/source/guides/private.md
@@ -28,7 +28,7 @@ Transformers.js will attach an Authorization header to requests made to the Hugg
 One way to do this is to call your program with the environment variable set. For example, let's say you have a file called `llama.js` with the following code:

 ```js
-import { AutoTokenizer } from '@xenova/transformers';
+import { AutoTokenizer } from '@huggingface/transformers';

 // Load tokenizer for a gated repository.
 const tokenizer = await AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-hf');
2 changes: 1 addition & 1 deletion docs/source/pipelines.md
@@ -14,7 +14,7 @@ For the full list of available tasks/pipelines, check out [this table](#availabl
 Start by creating an instance of `pipeline()` and specifying a task you want to use it for. For example, to create a sentiment analysis pipeline, you can do:

 ```javascript
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';

 let classifier = await pipeline('sentiment-analysis');
 ```
12 changes: 6 additions & 6 deletions docs/source/tutorials/next.md
@@ -42,11 +42,11 @@ On installation, you'll see various prompts. For this demo, we'll be selecting t

 ### Step 2: Install and configure Transformers.js

-You can install Transformers.js from [NPM](https://www.npmjs.com/package/@xenova/transformers) with the following command:
+You can install Transformers.js from [NPM](https://www.npmjs.com/package/@huggingface/transformers) with the following command:


 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 We also need to update the `next.config.js` file to ignore node-specific modules when bundling for the browser:
@@ -76,7 +76,7 @@ module.exports = nextConfig
 Next, we'll create a new [Web Worker](https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Using_web_workers) script where we'll place all ML-related code. This is to ensure that the main thread is not blocked while the model is loading and performing inference. For this application, we'll be using [`Xenova/distilbert-base-uncased-finetuned-sst-2-english`](https://huggingface.co/Xenova/distilbert-base-uncased-finetuned-sst-2-english), a ~67M parameter model finetuned on the [Stanford Sentiment Treebank](https://huggingface.co/datasets/sst) dataset. Add the following code to `./src/app/worker.js`:

 ```js
-import { pipeline, env } from "@xenova/transformers";
+import { pipeline, env } from "@huggingface/transformers";

 // Skip local model check
 env.allowLocalModels = false;
@@ -264,11 +264,11 @@ On installation, you'll see various prompts. For this demo, we'll be selecting t

 ### Step 2: Install and configure Transformers.js

-You can install Transformers.js from [NPM](https://www.npmjs.com/package/@xenova/transformers) with the following command:
+You can install Transformers.js from [NPM](https://www.npmjs.com/package/@huggingface/transformers) with the following command:


 ```bash
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 We also need to update the `next.config.js` file to prevent Webpack from bundling certain packages:
@@ -294,7 +294,7 @@ Next, let's set up our Route Handler. We can do this by creating two files in a
 1. `pipeline.js` - to handle the construction of our pipeline.

 ```js
-import { pipeline } from "@xenova/transformers";
+import { pipeline } from "@huggingface/transformers";

 // Use the Singleton pattern to enable lazy construction of the pipeline.
 // NOTE: We wrap the class in a function to prevent code duplication (see below).
10 changes: 5 additions & 5 deletions docs/source/tutorials/node.md
@@ -31,11 +31,11 @@ Although you can always use the [Python library](https://github.com/huggingface/

 ## Getting started

-Let's start by creating a new Node.js project and installing Transformers.js via [NPM](https://www.npmjs.com/package/@xenova/transformers):
+Let's start by creating a new Node.js project and installing Transformers.js via [NPM](https://www.npmjs.com/package/@huggingface/transformers):

 ```bash
 npm init -y
-npm i @xenova/transformers
+npm i @huggingface/transformers
 ```

 Next, create a new file called `app.js`, which will be the entry point for our application. Depending on whether you're using [ECMAScript modules](#ecmascript-modules-esm) or [CommonJS](#commonjs), you will need to do some things differently (see below).
@@ -66,7 +66,7 @@ import url from 'url';
 Following that, let's import Transformers.js and define the `MyClassificationPipeline` class.

 ```javascript
-import { pipeline, env } from '@xenova/transformers';
+import { pipeline, env } from '@huggingface/transformers';

 class MyClassificationPipeline {
 static task = 'text-classification';
@@ -107,7 +107,7 @@ class MyClassificationPipeline {
 static async getInstance(progress_callback = null) {
 if (this.instance === null) {
 // Dynamically import the Transformers.js library
-let { pipeline, env } = await import('@xenova/transformers');
+let { pipeline, env } = await import('@huggingface/transformers');

 // NOTE: Uncomment this to change the cache directory
 // env.cacheDir = './.cache';
@@ -195,7 +195,7 @@ Great! We've successfully created a basic HTTP server that uses Transformers.js

 ### Model caching

-By default, the first time you run the application, it will download the model files and cache them on your file system (in `./node_modules/@xenova/transformers/.cache/`). All subsequent requests will then use this model. You can change the location of the cache by setting `env.cacheDir`. For example, to cache the model in the `.cache` directory in the current working directory, you can add:
+By default, the first time you run the application, it will download the model files and cache them on your file system (in `./node_modules/@huggingface/transformers/.cache/`). All subsequent requests will then use this model. You can change the location of the cache by setting `env.cacheDir`. For example, to cache the model in the `.cache` directory in the current working directory, you can add:

 ```javascript
 env.cacheDir = './.cache';
6 changes: 3 additions & 3 deletions docs/source/tutorials/react.md
@@ -44,10 +44,10 @@ You can stop the development server by pressing <kbd>Ctrl</kbd> + <kbd>C</kbd> i

 ## Step 2: Install and configure Transformers.js

-Now we get to the fun part: adding machine learning to our application! First, install Transformers.js from [NPM](https://www.npmjs.com/package/@xenova/transformers) with the following command:
+Now we get to the fun part: adding machine learning to our application! First, install Transformers.js from [NPM](https://www.npmjs.com/package/@huggingface/transformers) with the following command:

 ```bash
-npm install @xenova/transformers
+npm install @huggingface/transformers
 ```

 For this application, we will use the [Xenova/nllb-200-distilled-600M](https://huggingface.co/Xenova/nllb-200-distilled-600M) model, which can perform multilingual translation among 200 languages. Before we start, there are 2 things we need to take note of:
@@ -58,7 +58,7 @@ We can achieve both of these goals by using a [Web Worker](https://developer.moz

 1. Create a file called `worker.js` in the `src` directory. This script will do all the heavy-lifting for us, including loading and running of the translation pipeline. To ensure the model is only loaded once, we will create the `MyTranslationPipeline` class which uses the [singleton pattern](https://en.wikipedia.org/wiki/Singleton_pattern) to lazily create a single instance of the pipeline when `getInstance` is first called, and use this pipeline for all subsequent calls:
 ```javascript
-import { pipeline } from '@xenova/transformers';
+import { pipeline } from '@huggingface/transformers';

 class MyTranslationPipeline {
 static task = 'translation';
2 changes: 1 addition & 1 deletion docs/source/tutorials/vanilla-js.md
@@ -105,7 +105,7 @@ The `type="module"` attribute is important, as it turns our file into a [JavaScr
 Moving into `index.js`, let's import Transformers.js by adding the following line to the top of the file:

 ```js
-import { pipeline, env } from "https://cdn.jsdelivr.net/npm/@xenova/transformers@2.6.0";
+import { pipeline, env } from "https://cdn.jsdelivr.net/npm/@huggingface/transformers";
 ```

 Since we will be downloading the model from the Hugging Face Hub, we can skip the local model check by setting:
1 change: 1 addition & 0 deletions examples/segment-anything-client/.gitignore
@@ -0,0 +1 @@
+dist
2 changes: 1 addition & 1 deletion examples/segment-anything-client/package.json
@@ -9,7 +9,7 @@
 "preview": "vite preview"
 },
 "dependencies": {
-"@xenova/transformers": "^3.0.0"
+"@huggingface/transformers": "^3.0.0-alpha.0"
 },
 "devDependencies": {
 "vite": "^5.2.9"
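Note the constraint change from `^3.0.0` to `^3.0.0-alpha.0`: npm's caret ranges exclude prerelease versions unless the range itself names a prerelease on the same major.minor.patch, so plain `^3.0.0` would never resolve to `3.0.0-alpha.3`. A quick check of that behavior, assuming the `semver` package is installed:

```javascript
// Sketch: how caret ranges treat prereleases (assumes `npm install semver`).
import semver from 'semver';

console.log(semver.satisfies('3.0.0-alpha.3', '^3.0.0'));         // false: prereleases are excluded
console.log(semver.satisfies('3.0.0-alpha.3', '^3.0.0-alpha.0')); // true: same-version prereleases match
console.log(semver.satisfies('3.0.0', '^3.0.0-alpha.0'));         // true: stable releases still match
```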
20 changes: 16 additions & 4 deletions examples/segment-anything-client/vite.config.js
@@ -1,6 +1,18 @@
 import { defineConfig } from 'vite';
-export default defineConfig({
-build: {
-target: 'esnext'
-}
+export default defineConfig(env => {
+const config = {
+build: {
+target: 'esnext'
+}
+};
+
+// TODO: Add this back when .wasm files are served locally
+// if (env.mode === 'development') {
+// // The .wasm files are not correctly served using Vite in development mode.
+// // This is a workaround to exclude the onnxruntime-web package from Vite's optimization.
+// // See also: https://github.com/vitejs/vite/issues/8427
+// config.optimizeDeps = { exclude: ["onnxruntime-web"] };
+// }
+
+return config;
 });
22 changes: 6 additions & 16 deletions examples/segment-anything-client/worker.js
@@ -1,4 +1,4 @@
-import { SamModel, AutoProcessor, RawImage, Tensor } from '@xenova/transformers';
+import { SamModel, AutoProcessor, RawImage, Tensor } from '@huggingface/transformers';

 // We adopt the singleton pattern to enable lazy-loading of the model and processor.
 export class SegmentAnythingSingleton {
@@ -7,21 +7,11 @@ export class SegmentAnythingSingleton {
 static processor;

 static getInstance() {
-if (!this.model) {
-this.model = SamModel.from_pretrained(this.model_id, {
-dtype: {
-vision_encoder: 'fp16',
-prompt_encoder_mask_decoder: 'q8',
-},
-device: {
-vision_encoder: 'webgpu',
-prompt_encoder_mask_decoder: 'wasm',
-}
-});
-}
-if (!this.processor) {
-this.processor = AutoProcessor.from_pretrained(this.model_id);
-}
+this.model ??= SamModel.from_pretrained(this.model_id, {
+dtype: 'fp16',
+device: 'webgpu',
+});
+this.processor ??= AutoProcessor.from_pretrained(this.model_id);

 return Promise.all([this.model, this.processor]);
 }
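The worker refactor above leans on `??=`, JavaScript's logical nullish assignment (see the "Use `??=` operator where possible" commit): it evaluates and assigns the right-hand side only when the left-hand side is `null` or `undefined`, so the first `getInstance()` call starts `from_pretrained` and later calls reuse the stored promise. A minimal sketch of the semantics, with a hypothetical `expensiveInit` standing in for `from_pretrained`:

```javascript
// `a ??= b` assigns only if `a` is null or undefined (other falsy values are kept).
// Roughly equivalent to: if (a === null || a === undefined) a = b;

function expensiveInit() { // hypothetical stand-in for SamModel.from_pretrained()
  console.log('initializing...');
  return Promise.resolve({ ready: true });
}

let instance;
instance ??= expensiveInit(); // logs 'initializing...' and stores the promise
instance ??= expensiveInit(); // no-op: instance is already non-nullish
```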
6 changes: 2 additions & 4 deletions jest.config.mjs
@@ -121,9 +121,7 @@ export default {
 // rootDir: undefined,

 // A list of paths to directories that Jest should use to search for files in
-roots: [
-"./tests/"
-],
+roots: ["./tests/"],

 // Allows you to use a custom runner instead of Jest's default test runner
 // runner: "jest-runner",
@@ -170,7 +168,7 @@ export default {
 // testRunner: "jest-circus/runner",

 // A map from regular expressions to paths to transformers
-transform: {}
+transform: {},

 // An array of regexp pattern strings that are matched against all source file paths, matched files will skip transformation
 // transformIgnorePatterns: [