H2O-GPT on AMD GPUs (ROCm) #1812
Comments
Can you share what you mean by it finding CUDA during install and failing? Logs would help. I adjusted one block; in docs/linux_install.sh CUDA is mentioned.
It should not be uninstalling the ROCm build of torch.

```
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
× Encountered error while generating package metadata.
note: This error originates from a subprocess, and is likely not a problem with pip.
note: This is an issue with the package mentioned above, not pip.
Attempting uninstall: torch
```
Do we have a ROCm Docker image?
We don't build one, but you can build one. |
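Since no official ROCm image is published, here is a minimal sketch of building your own on top of AMD's `rocm/pytorch` base image. The Dockerfile contents, the `CMD` flags, and the assumption that the repo's `requirements.txt` installs cleanly against ROCm torch are all illustrative; adjust them to your setup.

```shell
# Sketch: build a ROCm-based h2oGPT image (details are assumptions, not an
# official recipe -- verify against the h2oGPT docs for your version).
git clone https://github.com/h2oai/h2ogpt.git
cd h2ogpt

# Minimal Dockerfile layered on AMD's official ROCm PyTorch image.
cat > Dockerfile.rocm <<'EOF'
FROM rocm/pytorch:latest
WORKDIR /workspace/h2ogpt
COPY . .
# requirements.txt may pin CUDA wheels; you may need to edit it for ROCm first.
RUN pip install -r requirements.txt
# Illustrative entrypoint; pick the model flags appropriate for your hardware.
CMD ["python", "generate.py", "--base_model=h2oai/h2ogpt-4096-llama2-7b-chat"]
EOF

docker build -f Dockerfile.rocm -t h2ogpt-rocm .
```

To use the GPU at runtime, ROCm containers typically also need the device nodes passed through, e.g. `docker run --device=/dev/kfd --device=/dev/dri ... h2ogpt-rocm`.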
Hi, how can we run H2O-GPT on AMD GPUs using the AMD ROCm libraries?
One can easily run an inference server with Ollama using ROCm; H2O-GPT would then use this Ollama server for inference.
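As a sketch of that setup: Ollama exposes an OpenAI-compatible HTTP API on port 11434, so h2oGPT can be pointed at it as an external inference server. The model name and the exact `--inference_server` syntax on the h2oGPT side are assumptions here; verify them against the h2oGPT README.

```shell
# Start Ollama (it uses ROCm automatically on supported AMD GPUs)
# and pull a model (model name is illustrative).
ollama serve &
ollama pull llama3

# Sanity-check the OpenAI-compatible endpoint Ollama exposes.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "Hello"}]}'

# Point h2oGPT at that endpoint. The --inference_server format below is an
# assumption -- check the h2oGPT docs for the supported server types.
python generate.py --inference_server=vllm_chat:http://localhost:11434/v1 \
  --base_model=llama3
```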
Problem: the H2O-GPT install fails because it keeps finding CUDA during install. Some guidance on editing the install script for ROCm would be helpful.
Method: