Labels
feature request (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed)
Description
🚀 The feature, motivation and pitch
We will drop support for V0 in the very near future, so if we want to continue supporting this model in vLLM, it needs to be ported to V1.
The main work here is in:
- Adapting the custom Mamba layer to match how V1 manages the Mamba state. MambaMixer, MambaMixer2, MinimaxLinearAttention, and ShortConv can all serve as references for how to do this.
- Porting the differential attention backend to V1.
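To make the first task concrete, the sketch below illustrates the general pattern of pooled, per-request recurrent state that a V1-style Mamba layer has to work with: a fixed pool of state slots, each holding a causal-conv window and an SSM hidden state that are updated in place on every decode step. This is a minimal pure-PyTorch illustration only; the class and function names (`MambaStateCache`, `decode_step`) and all shapes are hypothetical and do not correspond to vLLM's actual V1 API.

```python
import torch


class MambaStateCache:
    """Hypothetical per-request state pool (illustrative, not vLLM's real API).

    Each request occupies one slot holding:
      - conv_state: the last (kernel - 1) inputs for the causal conv, and
      - ssm_state:  the recurrent SSM hidden state.
    """

    def __init__(self, num_slots, conv_dim, conv_kernel, ssm_dim, ssm_state_size):
        self.conv_state = torch.zeros(num_slots, conv_dim, conv_kernel - 1)
        self.ssm_state = torch.zeros(num_slots, ssm_dim, ssm_state_size)

    def reset(self, slot):
        """Clear a slot when its request finishes, so it can be reused."""
        self.conv_state[slot].zero_()
        self.ssm_state[slot].zero_()


def decode_step(cache, slot, x, conv_weight, A, B, C):
    """Single-token decode against pooled state (hypothetical helper).

    x:           (conv_dim,)               new token's features
    conv_weight: (conv_dim, conv_kernel)   depthwise causal conv weights
    A, B, C:     (ssm_dim, ssm_state_size) diagonal SSM parameters
    """
    # Causal conv over [cached window, new token]; shift the window in place.
    window = torch.cat([cache.conv_state[slot], x.unsqueeze(-1)], dim=-1)
    cache.conv_state[slot] = window[:, 1:]
    conv_out = (window * conv_weight).sum(-1)

    # Diagonal SSM recurrence: h <- A * h + B * u ; y = sum_n C * h.
    h = cache.ssm_state[slot]
    h = A * h + B * conv_out.unsqueeze(-1)
    cache.ssm_state[slot] = h
    return (h * C).sum(-1)
```

The point of the sketch is the contract, not the math: in V1 the layer does not own its state tensors per forward call; it reads and writes slots of a shared pool keyed by request, which is the adaptation the existing MambaMixer/MambaMixer2/ShortConv implementations demonstrate.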
Alternatives
Drop model support in vLLM
Additional context
No response
Before submitting a new issue...
- Make sure you have already searched for relevant issues and asked the chatbot at the bottom right corner of the documentation page, which can answer many frequently asked questions.