
Does RTX3090 support training and inference of this network? #48

Open
KevinBanksB opened this issue May 10, 2022 · 1 comment

Comments

@KevinBanksB

The author indicated that the graphics card used was a Tesla V100, which is a relatively demanding hardware requirement for a typical lab. Has the author run this network on lower-end graphics cards, such as the RTX 30 or RTX 20 series?

@arieling
Collaborator

It's out of scope for the current research project, but it's definitely doable!
I don't have an RTX GPU set up in our research environment.
Let me know if you want to test on an RTX 3090. If you post logs here, we can discuss them.
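As a starting point for such a test, here is a minimal sketch of how one might estimate whether a run tuned for a 32 GB V100 could fit on a 24 GB RTX 3090 by scaling the batch size. All numbers (reference batch size, fixed overhead) are illustrative assumptions, not values from this project:

```python
def max_batch_size(ref_batch, ref_mem_gb, target_mem_gb, fixed_mem_gb=4.0):
    """Estimate the largest batch that fits on a target GPU, assuming
    memory use = fixed overhead (weights, optimizer state) + a constant
    per-sample activation cost calibrated on a reference GPU."""
    # Per-sample cost inferred from the reference configuration.
    per_sample = (ref_mem_gb - fixed_mem_gb) / ref_batch
    # Largest whole batch that fits in the target GPU's remaining memory.
    return int((target_mem_gb - fixed_mem_gb) // per_sample)

# Hypothetical example: if batch 16 saturates a 32 GB V100,
# a 24 GB RTX 3090 could fit roughly batch 11 under these assumptions.
print(max_batch_size(ref_batch=16, ref_mem_gb=32.0, target_mem_gb=24.0))
```

In practice one would reduce the batch size (or use gradient accumulation) until the out-of-memory errors in the logs disappear; the linear model above is only a rough first guess.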
