[Web] WebGPU and WASM Backends Unavailable within Service Worker #20876
Comments
Thank you for reporting this issue. I will try to figure out how to fix this problem.
So it turns out that dynamic import (i.e. `import()`) is the problem. Currently, the WebAssembly factory (wasm-factory.ts) uses dynamic import to load the JS glue, and this does not work in a service worker. A few potential solutions are also not available:
I am now trying to make a JS bundle that does not use dynamic import, specifically for service worker usage. Still working on it.
Thanks, I appreciate your efforts around this. It does seem like some special-case bundle will need to be built after all; you might need |
I have considered this option. However, Emscripten does not offer an option to output both UMD (IIFE+CJS) and ESM for the JS glue (emscripten-core/emscripten#21899); I have to choose one. I chose the ES6 format output for the JS glue because of a couple of problems when importing UMD from ESM, and I found a way to make ORT Web work - yes, this needs the build script to do some special handling. And this will only work for ESM, because the JS glue is ESM and there seems to be no way to import ESM from UMD in a service worker.
### Description This PR allows building ORT Web as `ort{.all|.webgpu}.bundle.min.mjs`, which does not contain any dynamic import. This makes it possible to use ORT Web via static import in a service worker. Fixes #20876
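For anyone wondering what this enables, here is a minimal sketch of the intended usage. The bundle file name comes from the PR description; the import specifier/path and the message shape are assumptions and may differ depending on version and setup:

```js
// service-worker.js (registered with { type: "module" })
// Static import of the self-contained ESM bundle; no dynamic import() is used,
// so it can load inside a service worker. Adjust the path to however the
// bundle is exposed in your setup.
import * as ort from "onnxruntime-web/dist/ort.all.bundle.min.mjs";

self.addEventListener("message", (event) => {
  event.waitUntil(
    (async () => {
      const session = await ort.InferenceSession.create(event.data.modelUrl, {
        executionProviders: ["webgpu"],
      });
      // ... run inference with `session` and post the results back ...
    })()
  );
});
```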
@ggaabe Could you please help to try |
@fs-eire my project depends on transformers.js, which imports the onnxruntime WebGPU backend like this: https://github.com/xenova/transformers.js/blob/v3/src/backends/onnx.js#L24 Is this the right usage? In my project I've added this to my package.json to resolve onnxruntime to this new version, though the issue is still occurring:
Maybe also important: the same error still occurs in the same spot in the inference session, in the onnx package and not in transformers.js. Do I need to add a resolver for onnxruntime-common as well?
#20991 makes the default ESM import use non-dynamic import, and I hope this change may fix this problem. The PR is still in progress.
Hi @fs-eire, is the newly-merged fix in a released build I can try?
Please try 1.19.0-dev.20240612-94aa21c3dd
@fs-eire EDIT: Never mind the comment I just deleted; that error was because I didn't set the webpack option. However, I'm getting a new error now (progress!):
Update: found the error is happening here: onnxruntime/js/common/lib/backend-impl.ts, lines 83 to 86 (commit fff68c3).
For some reason the webgpu backend.init promise is rejecting due to the |
Could you share the reproduction steps with me?
@fs-eire You'll need to run the webGPU setup in a chrome extension.
@ggaabe I did some debugging on my box and made some fixes -
Awesome, thank you for your thoroughness in explaining this and tackling it head-on. Is there a dev channel version I can test out?
Not yet. Will update here once it is ready.
Sorry to bug you; is there a dev build number? I wasn't sure how often releases run.
Please try 1.19.0-dev.20240621-69d522f4e9
@fs-eire I'm getting one new error:
I pushed the code changes to my repo and fixed the call to the tokenizer. To reproduce, just type one letter in the Chrome extension's text input and wait.
Hey, I also need this. I am struggling with importing this version. So far I have been importing ONNX using |
just replace |
I created #21534, which is a replacement of #21430:
### Description This PR adds a new option, `ort.env.wasm.wasmBinary`, which allows the user to set a buffer containing the preloaded .wasm file content. This PR should resolve the problem from the latest discussion in #20876.
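A rough sketch of how the new option could be used in a service worker follows. The .wasm file name and the URL resolution via `chrome.runtime.getURL` are assumptions; adjust them to your build:

```js
import * as ort from "onnxruntime-web";

async function initOrt(modelUrl) {
  // Preload the .wasm bytes and hand them to ORT so the runtime does not need
  // to locate or fetch the .wasm file itself inside the service worker.
  const wasmUrl = chrome.runtime.getURL("ort-wasm-simd-threaded.wasm"); // assumed file name
  ort.env.wasm.wasmBinary = await (await fetch(wasmUrl)).arrayBuffer();

  return ort.InferenceSession.create(modelUrl, {
    executionProviders: ["webgpu"],
  });
}
```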
1.19.0-dev.20240801-4b8f6dcbb6 includes the change.
To clarify, is the best way to run transformers.js with WebGPU to monkey-patch the onnxruntime package so the necessary wasm pieces load in each service worker, a la @kyr0's workaround? Has anyone had luck with, or tips for, just running the v3 branch of Transformers.js? Or, maybe more precisely: do we know how something like Segment Anything WebGPU, which Xenova has in an HF Space, is working? It seems like there's been some official solution here, but I can't find it documented or implemented well.
I am working with Transformers.js to make the v3 branch compatible with the latest module system. This is one of the merged changes: huggingface/transformers.js#864. You probably need to use some workaround for now, but (hopefully) eventually you should be able to use it out of the box.
@lucasgelfond Now that the new updates from @fs-eire are in place, I'm probably able to streamline the workaround. I'll have a look soon, but as I'm on vacation right now, I cannot give an ETA, unfortunately.
Has anyone tried getting these imports working in Vite/other bundlers? When I try the classic:
(which works in create-react-app), Vite says:
Anyway, I tried importing from a URL, a la
which Vite also doesn't like
I disabled SSR in Svelte but still seemingly no luck/change. I tried manually downloading the files with curl; I got an error about a missing source map, so I also downloaded .min.js.map. When I run it now, this loads, but I get back to the original error in the thread about unavailable backends:
I figured it might work to just import directly, so I also tried:
but then I got an error. Anyone have ideas of how to handle this? Happy to add more verbose error messages for any of the stuff above.
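(As an aside, one adjustment that is sometimes needed with Vite is to keep onnxruntime-web out of dependency pre-bundling. This is a hedged sketch, not a confirmed fix for the errors above:)

```js
// vite.config.js - sketch: exclude onnxruntime-web from Vite's pre-bundling so
// its wasm/ESM loading is not rewritten at dev time.
import { defineConfig } from "vite";

export default defineConfig({
  optimizeDeps: {
    exclude: ["onnxruntime-web"],
  },
});
```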
Could you share a repo where I can reproduce the issue? I will take a look.
@fs-eire you are amazing! https://github.com/lucasgelfond/webgpu-sam2 - I swapped over to Webpack (in the svelte-webpack directory), but the original Vite version is in there. No immediate rush because I solved it temporarily with Webpack, but Webpack breaks some other imports, so it would be awesome to move back. Thanks so much again!
👋 Thank you @fs-eire ! I tried using |
Doesn't it work by just replacing
It works indeed 😵 I tried doing |
So I'm still having this issue in 1.19.2, in the context of a Chrome extension (MV3). This: onnxruntime/js/web/lib/wasm/wasm-factory.ts, line 119 (commit e91ff94)
Calling this:
Seems to lead to this:
I realize the poster above me is running the same setup and has it working, but I'm really not sure what to do differently. Using code from the test file, I've tried replicating it like so, but this doesn't seem to work:
If you are using 1.19.2 and still run into this error, it is probably because your bundler imports onnxruntime-web as UMD. Please verify the following:
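As a general illustration of what such a check amounts to, the bundler needs to target a worker-like environment and resolve the package's ESM entry (the "import" condition of its "exports" map) rather than a UMD build reachable through legacy fields. A webpack 5-flavoured sketch, with illustrative values:

```js
// webpack.config.js - sketch: build for a worker-like target and prefer the
// "import"/ESM condition when resolving onnxruntime-web.
module.exports = {
  target: "webworker",
  resolve: {
    conditionNames: ["import", "module", "browser", "default"],
  },
};
```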
Thanks so much for your help! The bundler was indeed the issue. For anyone reading this: I was using Vite 4, and it was preferring the browser field in the package.json, which led to the wrong file. Switching to Vite 5 solves that issue, as you can change the order of fields, even though it will by default already prefer the exports field. I have another issue now, though: now that the correct file has made it, I am getting this error:
Any ideas what that might be?
Edit:
Edit #2: web workers are not available in service workers that run through the background script, hence CPU does not work there.
Edit #3: should multithreading be possible in a Chrome extension? I've got this
So yeah, if anyone has successfully used CPU multithreading in a Chrome extension, it doesn't matter how, please let me know.
In my understanding, |
I have a follow-up issue that I struggle to understand when using onnxruntime-web in an MV3 extension service worker bundled with webpack. The symptom is that the service worker crashes right away with this error:
Here is a small reproduction sandbox: (download the dist folder to try it out in a web browser where you can load unpacked extensions - I tested in Chrome and Firefox.) There is a way to make it work using ... I am not sure if this requires a dedicated issue or if it's okay to follow up in this one; let me know @fs-eire.
It looks like you are using jsonp for chunk loading in webpack. I don't think this will work in a service worker. I think the solution is to add `target: 'webworker'`:

```diff
module.exports = {
  mode: "development",
  devtool: false,
+ target: 'webworker',
  entry: {
```
Then I can see another error, which is expected (because the model is an invalid dummy file):
Thanks for looking into it! That could maybe be solved by generating a worker-compatible-only JS file through Emscripten using the flag ... I will give it a try and let you know! Edit: I tried adding
This is somehow broken again in the latest dev releases - for example, 1.21.0-dev.20250117-db8e10b0b9 fails, whereas just changing it back to 1.21.0-dev.20241205-d27fecd3d3 works.
Same problem here, how should I solve the |
More discussion here: #20991 (comment)
Please refer to #20991 (comment), which contains the latest solution and explanation for using ORT Web in a service worker.
I tried to use Emscripten to build multiple targets (emscripten-core/emscripten#21899), but it turns out to be difficult. Currently the onnxruntime-web build process has a few post-processing steps to make the output work for Web, WebWorker, and Node. It is also a problem how to tell the runtime or bundler to import the correct file, given that the predefined "conditions" in the package.json "exports" property do not contain anything that distinguishes between web and webworker.
I think you're right, since emscripten outputs a web compatible file and webpack by default uses the |
@fs-eire Actually I have a different question. I managed to get ... The issue I'm having is that when I run a larger inference, it blocks all other system tasks, the browser UI, etc., as it runs at max performance on all cores. My goal: have 1 or 2 reserved cores, so basically supplying
Having that work would allow me to let the browser UI and the rest of the processes breathe during a longer inference, which can take around a minute. Edit: I am creating the session like this to get the error:
The whole execution process is driven by the CPU, and the entry point of the inference execution is the wasm exported function ... If you are working on a product that has a UI responsiveness requirement, the solution is to use onnxruntime-web in a web worker and write communication logic between the worker and the main thread. Regarding the thread count setting: for onnxruntime-web, intraOpNumThreads is not available (the documentation says it is only for onnxruntime-node and onnxruntime-react-native). You should use
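A minimal sketch of that split, assuming the thread-count knob being referred to is `ort.env.wasm.numThreads` (an assumption; check the current docs) and a plain dedicated Web Worker:

```js
// worker.js - inference runs here so the main thread / UI stays responsive.
import * as ort from "onnxruntime-web";

ort.env.wasm.numThreads = 2; // assumed knob: cap wasm threads to leave cores free

let session;

self.onmessage = async (e) => {
  const msg = e.data;
  if (msg.type === "init") {
    session = await ort.InferenceSession.create(msg.modelUrl);
    self.postMessage({ type: "ready" });
  } else if (msg.type === "run") {
    // Rebuild the input tensor from plain data; Tensor instances themselves
    // don't survive structured cloning across postMessage.
    const feeds = { [msg.inputName]: new ort.Tensor("float32", msg.data, msg.dims) };
    const results = await session.run(feeds);
    const out = results[session.outputNames[0]];
    self.postMessage({ type: "result", data: out.data, dims: out.dims });
  }
};
```

On the main thread, the worker would be created with `new Worker("worker.js", { type: "module" })` and driven via `postMessage`.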
Describe the issue
I'm running into issues trying to use the WebGPU or WASM backends inside of a ServiceWorker (in a Chrome extension). More specifically, I'm attempting to use Phi-3 with transformers.js v3.
Every time I attempt this, I get the following error:
This is originating in the `InferenceSession` class in `js/common/lib/inference-session-impl.ts`. More specifically, it's happening in this method:

```js
const [backend, optionsWithValidatedEPs] = await resolveBackendAndExecutionProviders(options);
```

where the implementation is in `js/common/lib/backend-impl.ts` and `tryResolveAndInitializeBackend` fails to initialize any of the execution providers.

WebGPU is now supported in ServiceWorkers though; it is a recent change and it should be feasible. Here were the Chrome release notes.
Additionally, here is an example browser extension from the mlc-ai/web-llm framework that implements WebGPU usage in service workers successfully:
https://github.com/mlc-ai/web-llm/tree/main/examples/chrome-extension-webgpu-service-worker
Here is some further discussion on this new support from Google itself:
https://groups.google.com/a/chromium.org/g/chromium-extensions/c/ZEcSLsjCw84/m/WkQa5LAHAQAJ
So technically I think it should be possible for this to be supported now? Unless I'm doing something else glaringly wrong. Is it possible to add support for this?
To reproduce
Download and set up the transformers.js extension example and put this into the background.js file:
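As a rough approximation of that setup, a background.js along these lines would exercise the same path; the package name, model id, and generation options are assumptions based on the transformers.js v3 API, not the reporter's original snippet:

```js
// background.js - extension service worker (declared with "type": "module").
// Loads a text-generation pipeline on WebGPU and answers messages from the popup.
import { pipeline } from "@huggingface/transformers"; // assumed v3 package name

let generatorPromise;

chrome.runtime.onMessage.addListener((message, _sender, sendResponse) => {
  (async () => {
    generatorPromise ??= pipeline(
      "text-generation",
      "onnx-community/Phi-3-mini-4k-instruct-onnx-web", // assumed model id
      { device: "webgpu" }
    );
    const generator = await generatorPromise;
    const output = await generator(message.prompt, { max_new_tokens: 64 });
    sendResponse(output);
  })();
  return true; // keep the message channel open for the async sendResponse
});
```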
Urgency
this would help enable a new ecosystem to build up around locally intelligent browser extensions and tooling.
it's urgent for me because it would be fun to build and I want to build it and it would be fun to be building it rather than not be building it.
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.19.0-dev.20240509-69cfcba38a
Execution Provider
'webgpu' (WebGPU)