
fsspec for batch inference example #1891

Closed
msaroufim opened this issue Oct 6, 2022 · 2 comments
Labels: example, good first issue (Good for newcomers)

Comments

msaroufim commented Oct 6, 2022

🚀 The feature

fsspec provides the ability to work with remote file systems like S3, GCS, or Azure Blob Storage through a single unified API.
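A small illustration of that unified API, using the in-memory backend as a stand-in for a cloud store (for a real remote store only the protocol prefix changes, e.g. `s3://`, `gcs://`, or `abfs://`; the filename here is a placeholder):

```python
import fsspec

# The same fsspec.open() call works across backends; only the
# protocol prefix changes (s3://, gcs://, abfs://, file://, memory://).
with fsspec.open("memory://demo.txt", "wt") as f:
    f.write("hello")

with fsspec.open("memory://demo.txt", "rt") as f:
    content = f.read()

print(content)  # -> hello
```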

See this recent example of how broadly useful it is: pytorch/data#812

We can create an example that stores a large dataset in remote cloud storage, streams that data to a local client, and then sends requests to TorchServe for larger-scale batch inference with a large batch size.
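A minimal sketch of the streaming side of such an example, using fsspec's `memory` filesystem as a stand-in for a remote bucket (the paths, batch size, and file contents below are illustrative placeholders; for S3 you would pass `protocol="s3"` and install `s3fs`). Each batch yielded here would then be posted to a TorchServe `/predictions/<model>` endpoint:

```python
import fsspec

def stream_batches(protocol, root, batch_size):
    """Read files from any fsspec-backed store and yield fixed-size batches."""
    fs = fsspec.filesystem(protocol)
    batch = []
    for path in sorted(fs.ls(root, detail=False)):
        with fs.open(path, "rb") as f:
            batch.append(f.read())
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

# Demo against the in-memory filesystem; swap protocol for a real store.
fs = fsspec.filesystem("memory")
for i in range(5):
    with fs.open(f"/data/sample{i}.bin", "wb") as f:
        f.write(bytes([i]))

batches = list(stream_batches("memory", "/data", batch_size=2))
print([len(b) for b in batches])  # -> [2, 2, 1]
```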

Motivation, pitch

See above

Alternatives

No response

Additional context

No response

@msaroufim msaroufim added good first issue Good for newcomers example labels Oct 6, 2022
@msaroufim msaroufim self-assigned this Oct 13, 2022
kirkpa commented Oct 20, 2022

Hi @msaroufim, can you add me as a collaborator/contributor to the pytorch/serve project? I would like to submit a PR for this issue. Thanks.

msaroufim commented

Hi @kirkpa, there is no need for that; you can create your own fork of pytorch/serve and then open a PR against this repo.
