[CLOSED] Updating to Transformers v2.6.0 #1059
Comments
Comment by sleepinyourhat Looks good so far! Thanks! |
Comment by sleepinyourhat Since you asked, this still looks good to me. If the testing infrastructure is all ready to go, though, it couldn't hurt to kick off tests with 2.8.0 now, too. |
Comment by zphang Updated the full table. I think we should be good to merge. Given the upcoming deadlines, I recommend waiting till after EMNLP to update transformers again. |
Comment by sleepinyourhat @zphang Why delay the merge? If we've vetted it to our usual degree, then we should get some additional speedups/options out of this PR. Of course, it's not great to make major changes after a round of experiments has already started, but the solution to that would just be to maintain a separate branch for each major experiment, which is a good idea in any case. |
Comment by zphang @pyeres I've removed the commit concerning the XLMRoBERTaTokenizer. This PR should only update the requirements (transformers and tokenizers). @sleepinyourhat To clarify, I support merging the update to v2.6.0 now, and putting off the update to v2.8.0. |
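For context, a requirements-only bump like the one described above would amount to a change along these lines (a hedged sketch; the exact `tokenizers` pin and file layout in the actual PR are not shown in this thread):

```text
# requirements.txt (illustrative fragment, not the literal diff from PR #1059)
transformers==2.6.0   # bumped from the previous pinned version
tokenizers==0.5.2     # assumed pin; transformers 2.6.0 depends on a matching tokenizers release
```

Pinning both packages together matters because `transformers` releases in this era were tightly coupled to specific `tokenizers` versions.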
Comment by zphang Unrelated: |
Issue by zphang
Friday Apr 10, 2020 at 21:37 GMT
Originally opened as nyu-mll/jiant#1059
Performance comparison on a set of representative tasks.
zphang included the following code: https://github.com/nyu-mll/jiant/pull/1059/commits