Segfault on linux with AMD GPU #1783
Comments
Can you please check if it works without setting that
Yeah, tried that already; same behavior without any arguments.
I have the same issue both before and after creating the 40 GB swap partition. It doesn't seem to be RAM/memory related. Full logs:
I have a similar problem; it seems like the swap is not being used or found. Running normally:
Running CPU only:
nvidia-smi, driver & CUDA versions (which should be compatible with the current torch version):
EDIT: I looked in the docs and in other issues for how to go about debugging this, but it is not clear to me. I'd love to help contribute if there are some resources I can start with.
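The "swap is not being used or found" suspicion above can be checked directly. A minimal sketch, assuming a standard Linux system (it only reads the /proc interface, no extra tools needed):

```shell
# Sketch: verify that swap is actually active and sized as expected.
cat /proc/swaps                      # lists active swap devices/files; header only if none
grep -i '^SwapTotal' /proc/meminfo   # total configured swap in kB
```

If `SwapTotal` reads 0 kB, the swap file or partition was never activated (e.g. `swapon` was not run or the fstab entry is missing), which would explain an out-of-memory segfault despite the 40 GB allocation.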
I have the same issue (Ryzen 9 7900X, 64 GB RAM, of which 16 GB is VRAM for the integrated GPU). When I run on Linux I get exactly the same issue. On Windows (I dual boot), it runs more or less OK (I sometimes get a crash because of the memory leak issue, but it works). Could it be linked to the ROCm version, which differs in the Linux/AMD instructions (5.6 instead of 5.7)? I have read that it does not go well with the VAE version (?). I tried a manual upgrade of ROCm, but it caused other problems. I found this interesting on the same subject: https://www.reddit.com/r/comfyui/comments/15b8lxd/comfyui_is_not_detecting_my_gpus_vram/
Hello,
Working with:
Feel free to tell me if I can run some tests or provide more information.
Hello, I had the same issue with the segfault on the following hardware:
I managed to make it run doing the following:
I can't test if it works on the RX 6000 series as my previous card is fried. Hope this helps 😄 |
Sadly this does not work for me...
OK, I could make it run with the lowvram option, but it never finishes generating any images.
Thank you @Athoir, but exactly the same as @carnager: it runs with
Complete steps to reproduce on Fedora / Nobara:
Perhaps related to the torch version? (See AUTOMATIC1111/stable-diffusion-webui#8139 (comment).)
For the RX 6700 XT, setting
Not for me...
and then nothing happens.
Working FAST with
Also an RX 6700 XT user; using HSA_OVERRIDE_GFX_VERSION=10.3.0 helped.
I confirm: on a 6600 XT, this solves the problem.
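For reference, the workaround reported above can be applied like this. A sketch, assuming the ROCm build of torch is installed; the 10.3.0 value makes the HSA runtime treat RDNA2 cards such as the RX 6600 XT (gfx1032) and RX 6700 XT (gfx1031) as the officially supported gfx1030 target:

```shell
# Sketch: set the architecture override before launching Fooocus.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
# python launch.py   # then launch as usual
```

Setting it inline (`HSA_OVERRIDE_GFX_VERSION=10.3.0 python launch.py`) works just as well and keeps the override scoped to a single run.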
I'm running here without any problems using this gist:
Read Troubleshoot
[x] I admit that I have read the Troubleshoot before making this issue.
Describe the problem
I installed Fooocus on Linux using the instructions on the main page. I uninstalled regular torch and installed the AMD version as mentioned on the front page. I created a 40 GB swap space and then ran the app with
python launch.py --attention-split
When I try to run an image generation, it seems to do something but then segfaults.
Some info about my setup:
Full Console Log