fsspec for batch inference example #1891
Comments
Hi @msaroufim, can you add me as a collaborator/contributor to the pytorch/serve project? I would like to submit a PR for this issue. Thanks
Hi @kirkpa, there is no need for that; you can create your own fork of pytorch/serve and then open a PR against this repo.
🚀 The feature
fsspec provides the ability to work with remote file systems such as S3, GCS, or Azure Blob Storage through a single unified API.
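A minimal sketch of that unified API. The snippet uses fsspec's in-memory filesystem so it is self-contained; swapping the URL scheme (e.g. `s3://`, `gcs://`, `abfs://`) targets real remote storage with the same code:

```python
import fsspec

# fsspec exposes local, in-memory, and remote storage behind one
# file-like API keyed by the URL scheme.
with fsspec.open("memory://demo/data.txt", "w") as f:
    f.write("hello fsspec")

with fsspec.open("memory://demo/data.txt", "r") as f:
    content = f.read()

print(content)
```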
See this recent example for how broadly useful it is pytorch/data#812
We can create an example that stores a large dataset in remote cloud storage, streams that data to a local client, and then sends requests to TorchServe for larger-scale batch inference with a large batch size.
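One way such an example could be sketched. The TorchServe endpoint URL, the remote glob pattern, and the `data` form field name are illustrative assumptions, not a confirmed design:

```python
import fsspec
import requests  # used to POST each batch to a hypothetical TorchServe endpoint


def batches(items, size):
    """Yield successive fixed-size chunks of a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def run_batch_inference(endpoint, remote_glob, batch_size=32):
    """Stream files matched by a remote glob (e.g. 's3://bucket/images/*.jpg')
    to a TorchServe prediction endpoint in fixed-size batches."""
    # Resolve the filesystem implementation and the matching paths from the URL.
    fs, _, paths = fsspec.get_fs_token_paths(remote_glob)
    for batch in batches(paths, batch_size):
        # fs.open streams each remote object lazily, so the client never
        # needs to download the whole dataset up front.
        files = [("data", fs.open(path, "rb")) for path in batch]
        response = requests.post(endpoint, files=files)
        response.raise_for_status()
```

The batching helper keeps request sizes bounded regardless of how many objects the glob matches.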
Motivation, pitch
See above
Alternatives
No response
Additional context
No response