Hello, I see that the code loads models through the AutoModelForCausalLM API. Would it be possible to support loading models with vLLM to speed up inference? As far as I know, vLLM also manages the keys and values in memory via PagedAttention; does that conflict with how InfLLM works?
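For context, a minimal sketch of the two loading paths I am comparing (the model name is a placeholder, not taken from the repo):

```python
# Current approach referenced above: HuggingFace Transformers loading.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("org/model-name")   # placeholder checkpoint
model = AutoModelForCausalLM.from_pretrained("org/model-name", device_map="auto")

# Proposed alternative: vLLM, which manages the KV cache with PagedAttention.
from vllm import LLM, SamplingParams

llm = LLM(model="org/model-name")                             # same placeholder checkpoint
outputs = llm.generate(["Hello"], SamplingParams(max_tokens=32))
```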
Any progress on this?