Trouble getting started with electron #336

Closed
tao opened this issue Sep 28, 2023 · 2 comments · Fixed by #342
Labels
bug Something isn't working

Comments

tao commented Sep 28, 2023

I'm trying to get started with this library as a web developer, and I've been trying to get it working in Electron, but I'm running into some issues.

I know the tutorial for Electron is still a work in progress, but I've run into so many problems that I thought I'd ask for help. At the very least this might give you some tips for writing the tutorial for other web developers, because I feel a bit out of my depth trying to get Python and ONNX and everything working.

Describe the bug

So anyway, I cloned the electron example, but when I try to run it on my Mac I get this error:

> [email protected] start
> electron-forge start

✔ Checking your system
✔ Locating application
✔ Loading configuration
✔ Preparing native dependencies: 1 / 1 [0.1s]
✔ Running generateAssets hook

(node:22988) UnhandledPromiseRejectionWarning: Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "/Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/models/distilbert-base-uncased-finetuned-sst-2-english/tokenizer.json".
    at getModelFile (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/utils/hub.js:459:27)
    at async getModelJSON (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/utils/hub.js:542:18)
    at async Promise.all (index 0)
    at async loadTokenizer (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/tokenizers.js:52:16)
    at async AutoTokenizer.from_pretrained (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/tokenizers.js:3826:48)
    at async Promise.all (index 0)
    at async loadItems (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/pipelines.js:2193:5)
    at async pipeline (file:///Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/src/pipelines.js:2139:19)
    at async /Users/tao/Code/transformers.js/examples/electron/src/model.js:18:17

This is just the first issue I'm trying to deal with to get the example working before I look at my own electron project.

Obviously the issue here is that it's looking for the files in the wrong place, under /node_modules/@xenova/transformers/models instead of /models. So I tried downloading the model from Hugging Face, but that didn't work... I realised the model needs to be converted to ONNX first, and the folder needs to look like this:

distilbert-base-uncased-finetuned-sst-2-english/
├── config.json
├── tokenizer.json
├── tokenizer_config.json
└── onnx/
    ├── model.onnx
    └── model_quantized.onnx
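For reference, I think you can also point the library at a folder outside node_modules. This is just a sketch based on my reading of the env settings (the exact paths here are my assumptions, not tested):

```javascript
// Sketch (my assumption from the env settings): tell transformers.js to load
// models from the project's root models/ folder instead of the package dir.
import path from 'path';
import { env, pipeline } from '@xenova/transformers';

// Resolve model files from ./models relative to the app.
env.localModelPath = path.join(process.cwd(), 'models');
// Stay offline: never fall back to downloading from the Hub.
env.allowRemoteModels = false;

// This should then look for
// ./models/distilbert-base-uncased-finetuned-sst-2-english/tokenizer.json etc.
const classifier = await pipeline(
  'sentiment-analysis',
  'distilbert-base-uncased-finetuned-sst-2-english'
);
```

If that works, it would avoid copying anything into node_modules at all.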

So now I've got to figure out how to install ONNX and use Python... I'm still learning, but maybe there's an easier way to find or download these pre-converted ONNX models for us Python noobs to use.

After that I spent some time trying to install ONNX. I had to figure out where the scripts.convert module lived, and I think I converted the model correctly with this command:

python3 -m scripts.convert --quantize --model_id distilbert-base-uncased-finetuned-sst-2-english

Then I renamed the output file from model.onnx to model_quantized.onnx and ran the command again without the --quantize flag (I'm not sure that's the intended workflow).

So at this point hopefully I'm following along correctly, I have no idea.

I ran the code again and found the same issue:

(node:26959) UnhandledPromiseRejectionWarning: Error: `local_files_only=true` or `env.allowRemoteModels=false` and file was not found locally at "/Users/tao/Code/transformers.js/examples/electron/node_modules/@xenova/transformers/models/distilbert-base-uncased-finetuned-sst-2-english/tokenizer.json".

So then I copied the new ONNX model into the node_modules folder again, and that seemed to work 🔥😓. Obviously that's not ideal, so we need a workaround to get it to read models from the root models/ folder, right?

The only issue left in the node console was this:

[27048:0928/160021.966025:ERROR:CONSOLE(1)] "Uncaught (in promise) TypeError: Failed to fetch", source: devtools://devtools/bundled/panels/elements/elements.js (1)

But that doesn't seem serious, so I'm not too worried about it for now; if I disable the Electron dev tools it goes away anyway.

So now the example project works and I've made some progress, and I'm trying to run the model in my own Electron project... I started a new Electron project with Electron Forge 6.

The first issue I ran into there was that it couldn't load the onnxruntime_binding.node module:

> electron-forge start

✔ Checking your system
✔ Locating application
✔ Loading configuration
✔ Preparing native dependencies [0.2s]
✔ Running generateAssets hook
✔ [plugin-webpack] Compiling main process code [1s]
✔ [plugin-webpack] Launching dev servers for renderer process code [0.1s]
  › Output Available: http://localhost:9000

(node:27482) UnhandledPromiseRejectionWarning: Error: Cannot find module '../bin/napi-v3/darwin/arm64/onnxruntime_binding.node'
Require stack:
- /Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js
- /Users/tao/Code/temp/electron/.webpack/main/index.js
- /Users/tao/Code/temp/electron/node_modules/electron/dist/Electron.app/Contents/Resources/default_app.asar/main.js
- 
    at node:internal/modules/cjs/loader:1084:15
    at Function._resolveFilename (node:electron/js2c/browser_init:2:117419)
    at node:internal/modules/cjs/loader:929:27

There were a few other GitHub issues related to that error and Next.js, so I updated the webpack config to look like this:

// webpack.main.config.ts

import type { Configuration } from 'webpack';

import { rules } from './webpack.rules';

export const mainConfig: Configuration = {
  /**
   * This is the main entry point for your application, it's the first file
   * that runs in the main process.
   */
  entry: './src/index.ts',
  // Put your normal webpack config below here
  module: {
    rules,
  },
  resolve: {
    extensions: ['.js', '.ts', '.jsx', '.tsx', '.css', '.json'],
    alias: {
      "sharp$": false,
      "onnxruntime-node$": false,
    }
  },
};
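An alternative I've seen suggested (untested by me, so just a sketch) is to keep onnxruntime-node available to the main process via webpack externals instead of aliasing it to false, so Node resolves the native module from node_modules at runtime:

```javascript
// webpack.main.config.js (sketch, untested): leave the native modules out of
// the bundle; the main process will require() them from node_modules instead.
module.exports = {
  entry: './src/index.ts',
  externals: {
    'onnxruntime-node': 'commonjs2 onnxruntime-node',
    sharp: 'commonjs2 sharp',
  },
  resolve: {
    extensions: ['.js', '.ts', '.jsx', '.tsx', '.css', '.json'],
  },
};
```

The difference is that the alias approach drops the native ONNX backend entirely, while externals would let the main process keep using it. This only makes sense for the main-process config, not the renderer.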

Now I think webpack is ignoring the onnxruntime-node module, since that's only required in the main (Node) process... I'm still trying to figure things out. Anyway, I think that worked, but now I've got another issue:

> electron-forge start

✔ Checking your system
✔ Locating application
✔ Loading configuration
✔ Preparing native dependencies [0.1s]
✔ Running generateAssets hook
✔ [plugin-webpack] Compiling main process code [0.7s]
✔ [plugin-webpack] Launching dev servers for renderer process code [0.1s]
  › Output Available: http://localhost:9000

(node:27641) UnhandledPromiseRejectionWarning: TypeError: Cannot read properties of undefined (reading 'wasm')
    at ./node_modules/@xenova/transformers/src/env.js (/Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js:1269:10)
    at __webpack_require__ (/Users/tao/Code/temp/electron/.webpack/main/index.js:1101:41)
    at ./node_modules/@xenova/transformers/src/utils/hub.js (/Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js:15268:65)
    at __webpack_require__ (/Users/tao/Code/temp/electron/.webpack/main/index.js:1101:41)
    at ./node_modules/@xenova/transformers/src/tokenizers.js (/Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js:9420:71)
    at __webpack_require__ (/Users/tao/Code/temp/electron/.webpack/main/index.js:1101:41)
    at ./node_modules/@xenova/transformers/src/pipelines.js (/Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js:5678:72)
    at __webpack_require__ (/Users/tao/Code/temp/electron/.webpack/main/index.js:1101:41)
    at ./node_modules/@xenova/transformers/src/transformers.js (/Users/tao/Code/temp/electron/.webpack/main/vendors-node_modules_xenova_transformers_src_models_js-node_modules_xenova_transformers_src_t-98979f.index.js:13577:71)
    at Function.__webpack_require__ (/Users/tao/Code/temp/electron/.webpack/main/index.js:1101:41)
(Use `Electron --trace-warnings ...` to show where the warning was created)
(node:27641) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag `--unhandled-rejections=strict` (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)

Now I'm a bit stuck and not really sure what to do next... I haven't done anything unusual: I followed the Electron tutorial until I got the ping-pong example working with the contextBridge, then copied over the code from the Electron demo, but it's running into this issue even though it's very similar.

I was able to get the Whisper Web demo working in a Next.js app, so I have been playing around with the code quite a lot, and I'm trying to get it into an Electron app next.

How to reproduce
Steps or a minimal working example to reproduce the behavior

I tried to recreate this bug in a demo, and I think it's a webpack issue now. When I create a default Electron Forge app and copy over the example code, it seems to work.

Demo link

You just need to add the ONNX models to the node_modules directory (which is probably a bug), but this works.

So here's the exact same thing using the Electron webpack template, where we get the wasm undefined error:

Webpack demo link

So I tried out the Electron Vite template too, but got stuck on a similar native-binding bug:

UnhandledPromiseRejectionWarning: Error: Could not dynamically require "../bin/napi-v3/darwin/arm64/onnxruntime_binding.node". Please configure the dynamicRequireTargets or/and ignoreDynamicRequires option of @rollup/plugin-commonjs appropriately for this require call to work.

So there's a demo for that too but not sure how to overcome that step yet.

I'd like to get Electron working with webpack so I can use React. Can anyone help me figure out what's going on with this wasm bug? I think it's in the transformers code, because it's trying to access something like onnx_env.wasm.wasmPaths, but onnx_env is probably undefined. Maybe because I didn't solve the onnxruntime-node issue correctly earlier?
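To illustrate my theory, here's a tiny reproduction of the failure mode I suspect (assuming the `false` alias makes the onnx module resolve to an empty export, so the env object the library reads `.wasm` from ends up undefined):

```javascript
// Minimal reproduction of my theory: if webpack replaces onnxruntime with an
// empty export, the env object the library reads from is undefined.
const onnx_env = undefined; // i.e. what ONNX.env yields after the alias

let error = null;
try {
  // Mirrors an access like onnx_env.wasm.wasmPaths inside the library.
  const wasmPaths = onnx_env.wasm.wasmPaths;
} catch (e) {
  error = e;
}

console.log(error instanceof TypeError); // true
```

On recent Node versions the message is "Cannot read properties of undefined (reading 'wasm')", which matches the stack trace above exactly.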

Environment

  • Transformers.js version: 2.6.1
  • Browser (if applicable): Electron (Chrome)
  • Operating system (if applicable): macOS Ventura 13.4.1
@tao tao added the bug Something isn't working label Sep 28, 2023
Collaborator

xenova commented Oct 2, 2023

Hi there! Thanks so much for your extensive testing and analysis. Admittedly, I wrote the electron example ~6 months ago and I haven't even written the tutorial for it yet (so it does need an update!).

Regarding the first issue (unable to find the models), I have modified the template so that the user no longer needs to download the model first (since Transformers.js now has Node caching). See here for more information.

Author

tao commented Oct 2, 2023

Okay, I installed the pull request and it seems like it's working.

$ npm uninstall @xenova/transformers
$ npm install xenova/transformers.js#pull/342/head

It seems like when you run the code, it downloads the models to the .cache directory inside the transformers node_modules folder... which is cool. So the remaining problem is getting the webpack/React setup to work inside Electron, but this solves the original issue, which is great.
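For anyone else reading: if you'd rather have the cache somewhere other than node_modules (e.g. Electron's per-user data folder, so it survives reinstalls), I believe you can override it like this (just a sketch based on my reading of the env settings, untested):

```javascript
// Sketch (assumption): redirect the model cache out of node_modules into
// Electron's userData directory. Run this in the main process before any
// pipeline() call so the first download lands in the right place.
import path from 'path';
import { app } from 'electron';
import { env } from '@xenova/transformers';

env.cacheDir = path.join(app.getPath('userData'), 'models');
```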

I'll play with it some more and create another issue / pull request if I make more progress. It would be great to get Whisper running locally in Electron, because I could build some cool things with that.
