
Multi-GPU training support #446

Closed
sleepinyourhat opened this issue Dec 21, 2018 · 6 comments
Assignees: @pruksmhc
Labels: help wanted (Extra attention is needed), jiant-v1-legacy (Relevant to versions <= v1.3.2)

Comments

@sleepinyourhat
Contributor

No description provided.

@sleepinyourhat
Contributor Author

cc @iftenney

@pruksmhc self-assigned this Jan 23, 2019
@sleepinyourhat
Contributor Author

It looks like the AllenNLP fundamentals all support this, though with some caveats: allenai/allennlp#2127

This is becoming increasingly urgent with BERT Large and XLNet: we can't use them at full sequence length on commodity GPUs.
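
For reference, here is a minimal sketch of what single-process multi-GPU training looks like in plain PyTorch (jiant is PyTorch-based), using `torch.nn.DataParallel`. The model and training step below are hypothetical stand-ins, not jiant's actual modules:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for an encoder + classification head.
model = nn.Sequential(nn.Linear(768, 768), nn.ReLU(), nn.Linear(768, 2))

device = "cuda" if torch.cuda.is_available() else "cpu"
if torch.cuda.device_count() > 1:
    # DataParallel splits each batch along dim 0 across the visible GPUs,
    # runs the replicas in parallel, and gathers outputs on device 0.
    model = nn.DataParallel(model)
model = model.to(device)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
loss_fn = nn.CrossEntropyLoss()

def train_step(inputs, labels):
    # inputs: (batch, 768) float tensor; labels: (batch,) long tensor
    inputs, labels = inputs.to(device), labels.to(device)
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), labels)
    loss.backward()   # gradients are accumulated on device 0
    optimizer.step()
    return loss.item()
```

`DataParallel` is the simplest drop-in option; `torch.nn.parallel.DistributedDataParallel` scales better across GPUs and nodes but needs process-group setup.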

@sleepinyourhat added the help wanted and high-priority labels and removed the high-priority label on Jun 26, 2019
@HaokunLiu
Member

May I ask what’s the status of this thread? Do you need some extra hands?

@sleepinyourhat
Contributor Author

Check with @pruksmhc, but I don't think anything is happening with this now. If you want to try it, please do!

@pruksmhc
Contributor

@HaokunLiu I'm actually working on this and planning to release a PR early next week!

@HaokunLiu
Member

@pruksmhc Oh, that's great. Thanks.
