This is a really great project, and I've had a lot of fun playing with it using schnell on my M1. I'm curious about other flux-derived models, in particular Shuttle 3 Diffusion. Is there a way that quantized GGUF models can be converted to run a little faster on Apple Silicon using the mflux library?
wilvancleve changed the title from "Curious is there's a way to use GGUF models?" to "Curious if there's a way to use GGUF models?" on Nov 20, 2024
Thank you! Currently, the only supported models and formats are the diffusers versions of flux schnell and dev, or models saved from this project. But we have an open PR from @anthonywu which I'm planning on including in the next release, and it looks like that one works with shuttle-3-diffusion among others. As for GGUF format support, that would be a nice feature to have, but probably not something that will be prioritised for a while.
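In the meantime, mflux's own quantization is an alternative route to smaller, faster models on Apple Silicon (instead of GGUF). A minimal sketch, assuming mflux is installed via `pip install mflux`; the flag names below are taken from the project README and may change between releases, so verify them before relying on this:

```shell
#!/bin/sh
# Sketch: use mflux's built-in quantization rather than GGUF.
# Assumptions: mflux installed; flags (-q/--quantize, --path) per its README.
set -e

# Guard: skip gracefully on machines without mflux installed.
if ! command -v mflux-generate >/dev/null 2>&1; then
    echo "mflux not installed; skipping"
    exit 0
fi

# Quantize the model to 8-bit on the fly while generating:
mflux-generate --model schnell --prompt "a red bicycle" --steps 2 --seed 2 -q 8

# Or save a quantized copy once, then reuse it for faster startup:
mflux-save --path ./schnell-8bit --model schnell --quantize 8
```

Saving a quantized copy avoids re-quantizing the full-precision weights on every run, at the cost of some disk space for the extra copy.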