
Request for T5 GPTQ model support #53

Open
sigmareaver opened this issue Jun 25, 2023 · 1 comment

Comments

@sigmareaver

I attempted to load flan-ul2 4-bit 128g GPTQ, but it looks like T5ForConditionalGeneration isn't supported, or perhaps encoder-decoder LLMs in general.
In particular, a patched T5 implementation such as https://github.com/qwopqwop200/transformers-t5 would likely also be needed to provide support for quantized T5.
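
For reference, here is a minimal sketch of the mismatch, assuming only the stock Hugging Face transformers API; the model ID and comments are illustrative, not KoboldAI's actual loading code:

```python
# Minimal sketch, assuming the standard transformers API.
# Full-precision flan-ul2 loads through the stock seq2seq class:
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/flan-ul2")
model = T5ForConditionalGeneration.from_pretrained("google/flan-ul2")

# A 4-bit 128g GPTQ export, by contrast, ships quantized tensors
# (qweight, qzeros, scales, g_idx) in place of the float Linear weights,
# so the stock T5 modules cannot consume it. Loading it would need a
# patched T5 implementation such as qwopqwop200/transformers-t5 plus
# GPTQ dequantization kernels wired into the loader.
```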

@sigmareaver (Author)

If possible, I'd rather open a pull request than burden you with a feature request. If you could point me to the files/functions I should look at to add support for a new model type, that should be enough to get me started, since I'm not familiar with KoboldAI's codebase.
