multi thread error on onnxruntime-node 1.21.0 #1292

Open
3 of 5 tasks
wszgrcy opened this issue Apr 23, 2025 · 1 comment
Labels
bug Something isn't working

Comments

wszgrcy commented Apr 23, 2025

System Info

Windows 10 / transformers.js 3.5.0

Environment/Platform

  • Website/web-app
  • Browser extension
  • Server-side (e.g., Node.js, Deno, Bun)
  • Desktop app (e.g., Electron)
  • Other (e.g., VSCode extension)

Description

microsoft/onnxruntime#24486

I found that after upgrading onnxruntime-node, running inference from multiple worker threads throws an exception, which may affect Transformers.js.
Should the dependency be downgraded? Version 1.20.1 works normally.
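Until a fixed release lands, one possible stopgap (my own suggestion, not an official recommendation) is to pin the transitive onnxruntime-node dependency back to 1.20.1 with npm's `overrides` field in `package.json` (pnpm and yarn have the analogous `pnpm.overrides` / `resolutions` fields):

```json
{
  "overrides": {
    "onnxruntime-node": "1.20.1"
  }
}
```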

Reproduction

worker.mjs

// Shims so CommonJS-style globals (require, __filename, __dirname)
// are available inside this ESM worker.
import { createRequire } from "node:module";
import path from "node:path";
import url from "node:url";
globalThis.require = createRequire(import.meta.url);
globalThis.__filename = url.fileURLToPath(import.meta.url);
globalThis.__dirname = path.dirname(globalThis.__filename);


import * as ort from "onnxruntime-node";
import { parentPort } from "worker_threads";

async function test() {
  // Load the example model relative to the working directory.
  let dir = path.join(process.cwd(), "bin/model.onnx");
  const session = await ort.InferenceSession.create(dir, {
    executionProviders: ["cpu"],
    executionMode: "parallel"
  });
  // 3x4 and 4x3 inputs for the example matmul model.
  const dataA = Float32Array.from([1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]);
  const dataB = Float32Array.from([10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, 120]);
  const tensorA = new ort.Tensor("float32", dataA, [3, 4]);
  const tensorB = new ort.Tensor("float32", dataB, [4, 3]);
  const feeds = { a: tensorA, b: tensorB };
  const results = await session.run(feeds);
  return results;
}

parentPort.on("message", async (value) => {
  console.log("go", value);
  await test();
});

index.mjs

import path from 'path';
import url from 'url';
import { Worker } from 'worker_threads';

// __dirname is not defined in ESM, so derive it from import.meta.url.
const __dirname = path.dirname(url.fileURLToPath(import.meta.url));

(() => {
  let workerPath = path.join(__dirname, './worker.mjs');
  // Spawn two workers; each creates its own InferenceSession and runs it.
  let instance = new Worker(workerPath);
  instance.postMessage(1);
  let instance2 = new Worker(workerPath);
  instance2.postMessage(1);
})();

model
https://github.com/microsoft/onnxruntime-inference-examples/blob/main/js/quick-start_onnxruntime-node/model.onnx

Works on 1.20.1.
Fails on 1.21.0 and 1.21.1.

@wszgrcy wszgrcy added the bug Something isn't working label Apr 23, 2025
xenova (Collaborator) commented Apr 25, 2025

Thanks for the report. This will be fixed in the next version when we upgrade to onnxruntime-node 1.22.0 👍

See microsoft/onnxruntime#24486 (comment) for more information.
