Fooocus keeps shutting down mid-startup #1657
Comments
Thank you for the terminal log. Please double-check your swap configuration.
Is there a possibility I can make it work on 8 GB RAM? I have a GTX 1660 Ti and 8 GB of RAM. I know the requirement is 16, but I was wondering if it is possible.
As of https://github.com/lllyasviel/Fooocus?tab=readme-ov-file#minimal-requirement it should work with 8 GB RAM. Where did you read the information about 16 GB?
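For anyone unsure whether their machine meets the requirement discussed above, here is a minimal sketch that reads total physical RAM before launching. The function name is hypothetical, the `os.sysconf` calls are POSIX-only (Windows users would need `ctypes` or a third-party library instead), and the 8 GB threshold is taken from the minimal-requirement table linked above:

```python
import os

def total_ram_gb():
    """Return total physical RAM in GiB (POSIX-only; uses sysconf)."""
    page_size = os.sysconf("SC_PAGE_SIZE")   # bytes per memory page
    num_pages = os.sysconf("SC_PHYS_PAGES")  # number of physical pages
    return page_size * num_pages / (1024 ** 3)

if __name__ == "__main__":
    ram = total_ram_gb()
    print(f"Total RAM: {ram:.1f} GiB")
    if ram < 8:
        print("Below the 8 GB minimal requirement; expect heavy swap use.")
```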
Having a similar issue, here is my console log:
As soon as I hit Generate, my 32 GB of RAM starts to fill up; when it reaches ~31 GB, VRAM starts to fill, and when that gets to ~11/16 GB it crashes with the above error. The GPU is an RX 7800 XT with all drivers updated.
I figured out my issue: last week I had 16 GB of memory with default swap settings, which created a ~15 GB page file. Now I have 32 GB and this started happening; however, the page file is still ~15 GB.
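The comment above illustrates a common pitfall: a fixed-size page file set for the old RAM amount survives a RAM upgrade unless it is switched back to system-managed. A small sketch of the sizing arithmetic (the helper name is hypothetical, and the 1.5x factor is a common Windows rule of thumb, not an official Fooocus recommendation):

```python
import math

def suggested_pagefile_gb(ram_gb, factor=1.5):
    """Rule-of-thumb page file size: factor x installed RAM, rounded up.

    The 1.5x factor is a common Windows guideline, not a Fooocus
    requirement; the key point is that the page file should grow
    with installed RAM rather than stay at its old fixed size.
    """
    return math.ceil(ram_gb * factor)

# The commenter upgraded from 16 GB to 32 GB but kept the old ~15 GB file:
print(suggested_pagefile_gb(16))  # -> 24
print(suggested_pagefile_gb(32))  # -> 48
```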
Read Troubleshoot
[x] I admit that I have read the Troubleshoot before making this issue.
Describe the problem
I do not know what to do to fix this. I have followed all the troubleshooting steps, including configuring the system swap, but still nothing works.
Full Console Log
Paste full console log here. You will make our job easier if you give a full log.
C:\Users\gabic\Downloads\Fooocus_win64_2-1-831>.\python_embeded\python.exe -s Fooocus\entry_with_update.py
Fast-forward merge
Update succeeded.
[System ARGV] ['Fooocus\entry_with_update.py']
Python 3.10.9 (tags/v3.10.9:1dd9be6, Dec 6 2022, 20:01:21) [MSC v.1934 64 bit (AMD64)]
Fooocus version: 2.1.856
Running on local URL: http://127.0.0.1:7865
To create a public link, set share=True in launch().
Total VRAM 6144 MB, total RAM 8017 MB
Set vram state to: NORMAL_VRAM
Always offload VRAM
Device: cuda:0 NVIDIA GeForce GTX 1660 Ti : native
VAE dtype: torch.float32
Using pytorch cross attention
Refiner unloaded.
model_type EPS
UNet ADM Dimension 2816
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
C:\Users\gabic\Downloads\Fooocus_win64_2-1-831>pause
Press any key to continue . . .