[FEATURE] Support Mixtral 7Bx8E on H2O LLM Studio #536
Comments
https://huggingface.co/DiscoResearch/mixtral-7b-8expert That one is already supported; you just need to install transformers from the dev branch. The official HF release is not out yet.
Thank you @psinger. I'll try this out.
huggingface/transformers#27942 We can leave this issue open to track progress; we also need to update transformers.
The release is already out: https://github.com/huggingface/transformers/releases/tag/v4.36.0
Mixtral is supported now after package upgrades. |
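For reference, a minimal sketch of loading Mixtral with the upgraded transformers package outside of LLM Studio. The model ID, dtype, and generation settings below are illustrative assumptions, not H2O LLM Studio defaults; the full model needs multiple GPUs or quantization.

```python
# Sketch: requires transformers >= 4.36.0 (pip install -U "transformers>=4.36.0")
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed model ID for illustration (official Mistral release on the Hub)
model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # reduce memory; still large, consider 4-bit quantization
    device_map="auto",           # spread layers across available GPUs
)

inputs = tokenizer(
    "Mixtral is a sparse mixture-of-experts model that",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```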
🚀 Feature
Support Mistral's new release - Mixtral 7Bx8E
Motivation
Mixtral matches or outperforms Llama 2 70B, as well as GPT-3.5, on most benchmarks.