Converting .safetensors not working #92
It seems that I have already found the problem. In the last pull request, which is almost ready to merge, the converter bug is fixed.
@FSSRepo Thank you for your hard work. Could the conversion method be extracted separately as an API?
@Cyberhan123 At the moment, the current version of the project is highly limited in exposing functions externally. Significant refactoring and testing would be required. Honestly, I don't currently have the necessary time.
If by "last pull request" you mean #88, it compiles, but it's still broken in the way the original report describes.
Could you give me more details: the model name, where you downloaded it, and the program's output? Also try building from scratch (clean the CMake cache).
Thanks, currently testing it; it loads the .safetensors file.
Have you used the CUDA version of sd.cpp?
Not yet, thanks for mentioning it. Will try this out soon.
The weights are Stable Diffusion v2, downloaded from the link provided in the README.
@Mek101 Could you pull the latest code, recompile, and give it a try? When running, could you add the -v parameter to check the output? If you don't mind, could you provide the md5sum of 'v2-1_768-nonema-pruned.safetensors'? I'd like to compare it with the file I have locally.
I tried the
It looks like you may have downloaded the wrong or incomplete file.

e43c2fb4baa9c30988a8d9e8ee644a33  v2-1_768-nonema-pruned.safetensors
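To compare against the hash quoted above without loading the whole multi-gigabyte checkpoint into memory, a chunked MD5 check can be done in a few lines. This is a minimal sketch, not part of sd.cpp; the file path and expected hash are taken from the comment above.

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 of a file in 1 MiB chunks so large checkpoints don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected hash for v2-1_768-nonema-pruned.safetensors, as reported above.
EXPECTED = "e43c2fb4baa9c30988a8d9e8ee644a33"
# if md5_of_file("v2-1_768-nonema-pruned.safetensors") != EXPECTED:
#     print("file is corrupt or incomplete -- re-download it")
```

A mismatch here means the download is truncated or the wrong file, so re-downloading is the fix rather than debugging the converter.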
This is probably because I gave the wrong URL to the curl command for v2-1_768-nonema-pruned.safetensors in the documentation. It's fixed, you can try it again.

curl -L -O https://huggingface.co/stabilityai/stable-diffusion-2-1/resolve/main/v2-1_768-nonema-pruned.safetensors
Hey guys, I tried to convert SDXL Turbo fp16 to 5.1 gguf and got the same error as mentioned above (if I remember correctly). The model got converted, but it gave me those errors after generating an image. I need to do this because the converted model is only 3 GB and much more RAM-friendly this way, instead of converting on the fly in sd.cpp.
I am trying to convert v1-5-pruned-emaonly.safetensors, but the generated file is not working.
and then
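Before blaming the converter, it can help to check that the .safetensors input itself is well-formed. The safetensors format begins with an unsigned little-endian 64-bit integer giving the length of a JSON header that immediately follows; a truncated or mislabeled file fails right at this step. A minimal sketch (independent of sd.cpp; the file name is the one from the comment above):

```python
import json
import struct

def read_safetensors_header(path):
    """Read and return the JSON header of a .safetensors file.

    The file starts with an unsigned little-endian 64-bit integer giving the
    byte length of the JSON header that follows it. If this fails, the file
    is truncated or not a safetensors file at all.
    """
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        return json.loads(f.read(header_len))

# header = read_safetensors_header("v1-5-pruned-emaonly.safetensors")
# print(len(header), "entries in header")  # tensor names plus optional __metadata__
```

If the header parses and lists the expected tensor names, the input is fine and the problem is on the conversion side.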