
Can we use multiple GPU on inference? #902

Closed
tamyxgopenx opened this issue Feb 19, 2020 · 3 comments
Labels
duplicate This issue or pull request already exists

Comments

@tamyxgopenx

🚀 Feature

Prediction using multiple GPUs

Motivation

I want to speed up the prediction time with more GPUs

Pitch

It would be easier if this could be set via the config, e.g. config.CPU_DEVICE_COUNT = 2
Is there any way to do it? I have seen python tools/train_net.py --num-gpus 4 --config-file configs/modanet.yaml, but I guess that is for training; I only want to run prediction on frames.
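The feature requested above can be approximated today with frame-level data parallelism: run one predictor process per GPU and shard the frames between them. Below is a minimal, stdlib-only sketch of that pattern; `predict_stub` is a hypothetical stand-in for building and calling a real detectron2 `DefaultPredictor` inside each worker, and the function names here are illustrative, not detectron2 API.

```python
import os
from multiprocessing import Process, Queue

def predict_stub(frame):
    # Hypothetical stand-in for `DefaultPredictor(cfg)(frame)`.
    # It just echoes the frame and the device this worker was pinned to.
    return {"frame": frame, "device": os.environ.get("CUDA_VISIBLE_DEVICES", "cpu")}

def worker(gpu_id, frames, out_queue):
    # Pin this process to a single GPU *before* any CUDA context is created.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    # In real code you would build the predictor once per process here:
    #   predictor = DefaultPredictor(cfg)
    for frame in frames:
        out_queue.put(predict_stub(frame))

def parallel_predict(frames, num_gpus):
    # Round-robin shard the frames across the GPUs.
    shards = [frames[i::num_gpus] for i in range(num_gpus)]
    out_queue = Queue()
    procs = [Process(target=worker, args=(gpu_id, shard, out_queue))
             for gpu_id, shard in enumerate(shards)]
    for p in procs:
        p.start()
    # Collect one result per input frame (order is not guaranteed).
    results = [out_queue.get() for _ in range(len(frames))]
    for p in procs:
        p.join()
    return results

if __name__ == "__main__":
    results = parallel_predict(list(range(8)), num_gpus=2)
    print(len(results))
```

Note this speeds up throughput over a batch of frames; it does not split the computation of a single frame across GPUs, which is what the maintainer's answer below addresses.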

@ppwwyyxx
Contributor

See #716, #586, #33, etc.

@ppwwyyxx ppwwyyxx added the duplicate This issue or pull request already exists label Feb 19, 2020
@tamyxgopenx
Author

tamyxgopenx commented Feb 21, 2020

@ppwwyyxx Thanks for your answer, but is there a function that makes the GPUs compute a single frame in parallel?

@ppwwyyxx
Contributor

There isn't.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jan 2, 2021