[Bugfix] Update Run:AI Model Streamer Loading Integration #23845
Conversation
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default; only a small subset of essential tests runs automatically. You can ask your reviewers to trigger select CI tests on top of that subset. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging. If you have any questions, please reach out to us on Slack at https://slack.vllm.ai. 🚀
Hey @DarkLight1337.
Thanks for fixing!
Thanks for your work everyone - any timeline on the above getting merged/released?
Retrying the failing tests to see if they are related to this PR.
Failing Checks:
@pwschuurman I use MinIO to store the model, but it cannot run and returns an error. How should I use this? I can confirm that the access key and secret key are correct, and I can use this code to download the config.json file.
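Since MinIO is S3-compatible, a common first check is that the standard AWS-style environment variables are set before launching vLLM. A minimal sketch follows; the endpoint URL and key values are placeholders, and using `AWS_ENDPOINT_URL` as the endpoint variable is an assumption based on standard S3 tooling, not something stated in this thread:

```python
import os

# Placeholders only: point these at your own MinIO deployment and credentials.
os.environ["AWS_ENDPOINT_URL"] = "http://localhost:9000"   # MinIO endpoint
os.environ["AWS_ACCESS_KEY_ID"] = "minio-access-key"       # the access key ("ak")
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio-secret-key"   # the secret key ("sk")

# Verify all three are present before starting vLLM with the streamer.
required = ("AWS_ENDPOINT_URL", "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
missing = [name for name in required if not os.environ.get(name)]
print("missing:", missing)  # an empty list means the basics are in place
```

If a variable shows up as missing here, the streamer (or boto3-based tooling) will typically fail with an authentication or endpoint error before any model bytes are read.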
I found a bug in this project and filed run-ai/runai-model-streamer#81; it is currently unusable for me.
…ct#23845) Signed-off-by: Omer Dayan (SW-GPU) <omer@run.ai> Signed-off-by: Peter Schuurman <psch@google.com> Co-authored-by: Omer Dayan (SW-GPU) <omer@run.ai> Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
…ct#23845) Signed-off-by: Omer Dayan (SW-GPU) <omer@run.ai> Signed-off-by: Peter Schuurman <psch@google.com> Co-authored-by: Omer Dayan (SW-GPU) <omer@run.ai> Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk> Signed-off-by: xuebwang-amd <xuebwang@amd.com>
Purpose
This PR updates the Run:AI Model Streamer PIP package (runai-model-streamer) to the latest version (0.14.0) and introduces the runai-model-streamer-gcs PIP package.
Test Plan
Existing unit tests have been validated, and new unit tests have been added in tests/runai_model_streamer_test/test_runai_utils.py.
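As a sketch, the new tests can be run locally with pytest from a vLLM development checkout (assuming test dependencies are installed):

```shell
# Run the Run:AI streamer utility tests from the repo root
pytest tests/runai_model_streamer_test/test_runai_utils.py -v
```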
In addition, model loading has been tested with --load-format=runai_streamer, using models from local storage, S3, and GCS.
Local Storage
S3 Compatible Endpoint
GCS Endpoint
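As a hedged sketch of what such invocations might look like (model paths, bucket names, and the endpoint URL are placeholders, not taken from this PR):

```shell
# Local storage
vllm serve /models/Llama-3.1-8B --load-format runai_streamer

# S3-compatible endpoint (e.g. MinIO); endpoint supplied via env var
AWS_ENDPOINT_URL=http://localhost:9000 \
  vllm serve s3://my-bucket/Llama-3.1-8B --load-format runai_streamer

# GCS endpoint (relies on the runai-model-streamer-gcs package added here)
vllm serve gs://my-bucket/Llama-3.1-8B --load-format runai_streamer
```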
Test Result
Essential Elements of an Effective PR Description Checklist
Update supported_models.md and examples for a new model.