
[Bug]: Error: The 'onnxruntime-gpu' distribution was not found and is required by the application #383

Closed
SunGreen777 opened this issue Feb 12, 2024 · 10 comments
Labels
question Further information is requested

Comments

@SunGreen777

SunGreen777 commented Feb 12, 2024

Checklist

  • The issue exists after disabling all extensions
  • The issue exists on a clean installation of webui
  • The issue is caused by an extension, but I believe it is caused by a bug in the webui
  • The issue exists in the current version of the webui
  • The issue has not been reported before recently
  • The issue has been reported before but has not been fixed yet

What happened?

I'm getting the following error. Does anyone know what it means?
Error: The 'onnxruntime-gpu' distribution was not found and is required by the application


Steps to reproduce the problem

What should have happened?

What browsers do you use to access the UI ?

No response

Sysinfo

Console logs

-

Additional information

No response

@waylaa

waylaa commented Feb 12, 2024

Did you follow the steps from here?

After that, if you want ONNX to use DirectML, follow this
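For reference, a minimal sketch of what that usually amounts to, assuming the goal is the DirectML build of ONNX Runtime installed into the webui's own venv (the package name and venv path here are assumptions about a default setup, not taken from the linked steps):

rem assumed default venv location inside the webui folder
venv\Scripts\python.exe -m pip install onnxruntime-directml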

@lshqqytiger
Owner

Add --skip-ort if you don't want ONNX.
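For reference, a minimal sketch of where that flag usually goes, assuming the standard webui-user.bat launcher (keep any other flags you already use):

rem webui-user.bat (hypothetical example)
set COMMANDLINE_ARGS=--skip-ort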

@lshqqytiger lshqqytiger added the question Further information is requested label Feb 14, 2024
@SunGreen777
Author

This is a little confusing. I don't understand whether ONNX is needed or not, and if it's better and faster, why isn't it enabled by default? Can you explain, please? :)

@lshqqytiger
Owner

ONNX is slightly faster, and an Olive-optimized ONNX model is much faster, but that comes with overhead (model conversion and optimization).

@SunGreen777
Author

ONNX is slightly faster, and an Olive-optimized ONNX model is much faster, but that comes with overhead (model conversion and optimization).

Understood, thank you :)

@vutym

vutym commented Mar 5, 2024

Open CMD as administrator, or consider using the --user option, or check the permissions.

pip install onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/
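A sketch of the --user variant mentioned above, assuming the same package index:

pip install --user onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/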

@e2heintz

e2heintz commented Mar 7, 2024

I had the same error. I managed to fix it, but only partially.

I tried using Forge yesterday and it wasn't working properly. I came back to Directml this morning and had lots of errors, ending up with this 'no onnx runtime' thing. Now webui bypassed both my GPUs and generated on the CPU.

What I did:

  1. Deleted venv from the Directml folder: no help
  2. pip install torch-directml and onnxruntime-gpu: no help
  3. Completely deleted both the Forge folder and the Directml folder, then git pull, then added torch-directml to requirements and the cmd args --use-directml --skip-ort: somewhat working, but…

I cannot use --device-id. I'm getting 'invalid string', which is very bad. I really need that other GPU with its VRAM untouched by the OS and all the other apps.
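For reference, a hypothetical webui-user.bat line combining the flags from step 3 with a device index; the index 1 is only an assumption for a secondary GPU and depends on your system:

rem hypothetical example; adjust the device index to your setup
set COMMANDLINE_ARGS=--use-directml --skip-ort --device-id 1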

@e2heintz

e2heintz commented Mar 22, 2024

Update:
Uninstalled Python and Git. Deleted the webui folder completely. Restarted. Reinstalled Python and Git. Cloned the webui folder again with git.
SD is working with --skip-torch-cuda-test --use-directml, but it is using the wrong GPU. I don't want it to use my display-out GPU, I want it to use my secondary GPU.
When adding --device-id, I get the errors "The 'onnxruntime-gpu' distribution was not found" and "invalid device string".
--skip-ort does not help.
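One way to check which onnxruntime packages the webui's venv actually contains (a hypothetical check, assuming the default venv folder inside the webui directory):

rem run from the webui folder; the path is an assumption about a default install
venv\Scripts\python.exe -m pip list | findstr onnxruntime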

@luckyzsd1975

I have already run pip install onnxruntime, but I still get the error above when running open-webui serve. Who knows the reason?
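A possible explanation, offered as an assumption rather than something confirmed in this thread: onnxruntime and onnxruntime-gpu are separate distributions on PyPI, so installing the CPU package does not satisfy a requirement on the GPU one. A minimal sketch:

pip show onnxruntime-gpu
pip install onnxruntime-gpu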

@Chr-Wol

Chr-Wol commented Jul 16, 2024

I get the same issue. Did you find a way to solve it?
