Please add LoRA support for higher ranks and alpha values #2847
Comments
Mark
Bump
It's not super well documented, but you just need to pass `--max-lora-rank 64` (or whatever rank you need) when serving, since the default is 16: `python -m vllm.entrypoints.openai.api_server --max-lora-rank 64`
Thanks for the answer, it helped me as well. For those who use the Python API instead of the CLI, the equivalent setting is the `max_lora_rank` argument, as in the sketch below.
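For reference, a minimal sketch of the offline Python API equivalent, assuming a recent vLLM version where the `LLM` constructor forwards `enable_lora` and `max_lora_rank` to the engine arguments; the model and adapter paths are placeholders:

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Placeholder paths -- substitute your own base model and LoRA adapter.
BASE_MODEL = "meta-llama/Llama-2-7b-hf"
ADAPTER_PATH = "/path/to/lora_adapter"

# max_lora_rank must be at least the rank (r) of any adapter you load;
# the default of 16 is what triggers the ValueError quoted in this issue.
llm = LLM(model=BASE_MODEL, enable_lora=True, max_lora_rank=64)

outputs = llm.generate(
    ["Hello, my name is"],
    SamplingParams(max_tokens=32),
    lora_request=LoRARequest("my_adapter", 1, ADAPTER_PATH),
)
print(outputs[0].outputs[0].text)
```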
Both answers work for me, up to rank 64. Rank > 64 is not supported yet. See #3934
Can we get LoRA rank > 64 supported and merged? Edit: I'm also curious whether capping support at rank 64 was a deliberate design decision; if so, please let me know.
Bump. I need adapters that are much, much larger to be supported. Thanks
Is there something special about LoRA rank > 64? I wonder why only ranks <= 64 are supported.
Same here, this is a blocker for me.
@JohnUiterwyk, hasn't this been fixed by the suggestions from @dspoka and @spreadingmind? Their suggestions worked for me.
No. The maximum allowed max_lora_rank is 64, and going higher than that throws an error. I have adapters with rank 128 and 256 for certain use cases and cannot serve them with vLLM because of the hardcoded limit on the value passed to max_lora_rank.
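As an aside, if you are unsure what `max_lora_rank` a given adapter needs, the rank is recorded in the adapter's config. A small sketch, assuming the adapter was saved with PEFT and ships an `adapter_config.json`; the path is a placeholder:

```python
import json
from pathlib import Path

# Placeholder path to a PEFT-style LoRA adapter directory.
adapter_dir = Path("/path/to/lora_adapter")

# PEFT stores the LoRA rank under "r" and the scaling under "lora_alpha".
config = json.loads((adapter_dir / "adapter_config.json").read_text())
rank, alpha = config["r"], config.get("lora_alpha")

print(f"adapter rank r={rank}, lora_alpha={alpha}")
if rank > 64:
    print("This adapter exceeds the current vLLM cap of max_lora_rank=64.")
else:
    print(f"Serve with --max-lora-rank {rank} (or higher).")
```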
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
Any updates on this? Recent papers have shown that rank = 256 can be very beneficial, for example, and I suspect this trend toward higher ranks will continue in the near future.
ValueError: LoRA rank 64 is greater than max_lora_rank 16.