Merge rocm-ci transformers changes from upstream #17
Conversation
Merge from HF/transformer master
Add ortmodule option to trainer
Updated the option for training with onnxruntime-training to '--ort'
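A hypothetical invocation sketch: only the `--ort` flag comes from the commit above; the script name and the other arguments are illustrative placeholders, not part of this PR.

```shell
# Hypothetical example: enabling ONNX Runtime training via the new flag.
# run_glue.py and the other arguments are illustrative placeholders.
python run_glue.py \
  --model_name_or_path bert-base-uncased \
  --do_train \
  --ort
```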
Remove data based dependencies in T5 for ORT
…into raviskolli/ort
Raviskolli/ort
Fix for ortmodule + ds config
Raviskolli/ort
…into raviskolli/ort
Raviskolli/ort
Update for ORTModule package
Bert type cast fix
…into raviskolli/ort
Raviskolli/ort
* add ort config for debertav2 model
* remove prints
* remove old commented code
* fix run style error
* add flake ignore comment
* trial to fix blackify format error
Remove model specific changes for BERT and DistilBERT
…module hack to make roberta run with ortmodule
Update to import ORTModule from torch_ort, as torch-ort is now public. With this [PR](microsoft/onnxruntime#7847), there is no need to explicitly use _original_module for ORT
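The `_original_module` detail above can be illustrated with a minimal, generic wrapper sketch. This is plain Python, not the real torch-ort API; every name here is an illustrative stand-in.

```python
# Generic sketch of the wrapper pattern described above: the wrapper keeps
# the user's module as `_original_module` and delegates forward() to it.
# These classes are illustrative stand-ins, not the real torch-ort API.

class UserModule:
    """Stand-in for a torch.nn.Module with a simple forward pass."""
    def forward(self, x):
        return x * 2

class WrapperModule:
    """Stand-in for ORTModule: wraps a module and delegates calls to it."""
    def __init__(self, module):
        # The wrapped module is kept internally; once delegation works,
        # callers no longer need to reach into `_original_module` directly.
        self._original_module = module

    def forward(self, x):
        # The real ORTModule would dispatch to an ONNX Runtime session here.
        return self._original_module.forward(x)

wrapped = WrapperModule(UserModule())
print(wrapped.forward(3))  # prints 6
```

Because the wrapper delegates rather than copies, training code can treat the wrapped object like the original module, which is what lets the trainer swap it in behind a flag.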
Update trainer.py
Permit DDP wrapping for ORTModule
The documentation is not available anymore as the PR was closed or merged.
@pnunna93, this PR is bringing in a lot of changes. Can we restrict it to the minimum that enables ORT in HF?
This PR is bringing ORT support to the ROCm HF branch
@amathews-amd, created a separate PR with just the ORT changes. Closing this one as it's no longer needed
Merge the upstream rocm-ci branch into this branch to be able to run ORT HF workloads in DLM