
Curious if there's a way to use GGUF models? #99

Open
wilvancleve opened this issue Nov 20, 2024 · 1 comment
Labels: enhancement (New feature or request)

Comments

@wilvancleve commented Nov 20, 2024

This is a really great project, and I've had a lot of fun playing with it using schnell on my M1. I'm curious about other flux-derived models, in particular Shuttle 3 Diffusion. Is there a way that quantized GGUF models can be converted to run a little faster on Apple Silicon using the mflux library?

@wilvancleve changed the title from "Curious is there's a way to use GGUF models?" to "Curious if there's a way to use GGUF models?" Nov 20, 2024
@filipstrand added the enhancement (New feature or request) label Dec 27, 2024
@filipstrand (Owner) commented Dec 27, 2024

Thank you! Currently, the only supported models and formats are the diffusers versions of flux schnell and dev, or models saved from this project. But we have an open PR from @anthonywu which I'm planning on including in the next release, and it looks like that one works with shuttle-3-diffusion among others. As for GGUF support, that would be a nice feature, but probably not something that will be prioritised for a while.
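For reference on the "run a little faster on Apple Silicon" part of the question: mflux already supports quantizing the weights at load time, independently of GGUF. A minimal sketch of that currently supported path, assuming the Python API roughly as it appears in the mflux README at the time of this thread (Flux1.from_name, generate_image, Config; verify the exact names against the current docs):

```python
from mflux import Flux1, Config

# Load flux schnell and quantize the weights to 8-bit on the fly
# (4-bit is also supported); this reduces memory use and typically
# speeds up generation on Apple Silicon without any GGUF conversion.
flux = Flux1.from_name(
    model_name="schnell",  # "schnell" or "dev"
    quantize=8,
)

# schnell is tuned for very few steps (2-4); dev typically needs 20-25.
image = flux.generate_image(
    seed=2,
    prompt="Luxury food photograph",
    config=Config(num_inference_steps=2, height=1024, width=1024),
)
image.save(path="image.png")
```

Supporting GGUF directly would presumably mean mapping its quantized tensors back into the weight layout this loader expects, which is the conversion step the feature request asks about.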
