
[FEATURE] Support Mixtral 7Bx8E on H2O LLM Studio #536

Closed

binga opened this issue Dec 11, 2023 · 5 comments
Labels
type/feature Feature request

Comments

binga commented Dec 11, 2023

🚀 Feature

Support Mistral's new release: Mixtral 7Bx8E

Motivation

Mixtral matches or outperforms Llama 2 70B, as well as GPT-3.5, on most benchmarks.

binga added the type/feature Feature request label on Dec 11, 2023
psinger (Collaborator) commented Dec 11, 2023

https://huggingface.co/DiscoResearch/mixtral-7b-8expert

That one is already supported; you just need to update transformers from its development branch.

The official Hugging Face release is not out yet.
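
A minimal sketch of the suggested setup, assuming the community checkpoint bundles its own modeling code (hence the trust_remote_code=True flag, which is an assumption here); the prompt and generation settings are placeholders:

```python
# Install transformers from the development branch (the Mixtral architecture
# had not shipped in a stable release yet at the time of this comment):
#   pip install git+https://github.com/huggingface/transformers.git

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/mixtral-7b-8expert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code=True is an assumption: community checkpoints like this
# one often bundle their own modeling code with the weights.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",   # shard layers across available GPUs (needs accelerate)
    torch_dtype="auto",  # keep the dtype stored in the checkpoint
)

prompt = "H2O LLM Studio is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```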

binga (Author) commented Dec 11, 2023

Thank you @psinger. I'll try this out.

binga closed this as completed on Dec 11, 2023
psinger (Collaborator) commented Dec 11, 2023

huggingface/transformers#27942

We can leave this issue open for tracking; we also need to update transformers.

psinger reopened this on Dec 11, 2023
psinger (Collaborator) commented Jan 15, 2024

Mixtral is supported now after package upgrades.
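
For completeness, a sketch of loading Mixtral once the package upgrades are in place; this assumes a transformers release containing huggingface/transformers#27942 (v4.36.0 or later) and uses Mistral's official Hub checkpoint ID, which the thread itself does not name:

```python
# Requires a transformers release with native Mixtral support, i.e. a
# version containing huggingface/transformers#27942 (v4.36.0 or later):
#   pip install "transformers>=4.36.0"

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Mistral's official base checkpoint on the Hugging Face Hub (assumed ID;
# no trust_remote_code is needed once support is native to transformers).
model_id = "mistralai/Mixtral-8x7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 weights take roughly 90 GB; adjust to hardware
    device_map="auto",
)

inputs = tokenizer("Mixtral is a sparse mixture-of-experts model that", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```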

psinger closed this as completed on Jan 15, 2024

3 participants