Error in RAG finetuning script #8345
Comments
It is related to the optimizer initialization in the finetune.py script. It seems that even lightning_base.py has no initialization for the optimizer.
Any idea on this? I managed to work around it by calling the optimizer initialization inside the train_dataloader function in finetune.py, as sketched below.
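A minimal sketch of that workaround, assuming `configure_optimizers` (inherited from `examples/lightning_base.py`) sets `self.opt` as a side effect; the `hasattr` guard is my own addition, not code from the repo:

```python
from torch.utils.data import DataLoader

# Sketch of the workaround, inside GenerativeQAModule in examples/rag/finetune.py.
# Not an official fix: it forces optimizer creation before self.opt is read.
def train_dataloader(self) -> DataLoader:
    if not hasattr(self, "opt"):
        # configure_optimizers() (inherited from lightning_base.py) builds the
        # optimizer and stores it on self.opt as a side effect.
        self.configure_optimizers()
    return self.get_dataloader("train", batch_size=self.hparams.train_batch_size)
```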
Well, the optimizer/scheduler is already defined in examples/lightning_base.py, so I think we should remove those lines: transformers/examples/rag/finetune.py, lines 326 to 336 at commit 17b1fd8.
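For reference, the base module in `examples/lightning_base.py` already sets up both pieces along these lines (a paraphrase from memory, not an exact copy of the file):

```python
from transformers import AdamW

# Paraphrase of configure_optimizers() in examples/lightning_base.py:
# it creates the optimizer, stashes it on self.opt, and returns the
# optimizer/scheduler pair to PyTorch Lightning.
def configure_optimizers(self):
    optimizer = AdamW(
        self.model.parameters(),
        lr=self.hparams.learning_rate,
        eps=self.hparams.adam_epsilon,
    )
    self.opt = optimizer  # the attribute finetune.py's duplicate block reads
    scheduler = self.get_lr_scheduler()
    return [optimizer], [scheduler]
```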
I just tried removing them, and now I'm getting this other issue: #7816.
That's what I was thinking, since there is a specific definition in lightning_base.py.
Environment info
transformers version: 3.4.0
Who can help
@patrickvonplaten, @lhoestq
Information
I am using the RAG fine-tuning script. During the fine-tuning process, it says:
torch.nn.modules.module.ModuleAttributeError: 'GenerativeQAModule' object has no attribute 'opt'
The bug appears exactly at [line 332 in finetune.py](https://github.com/huggingface/transformers/blob/master/examples/rag/finetune.py#L332).
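For context, the block around that line has roughly this shape (reproduced from memory of the linked permalink, so details may differ):

```python
from transformers import get_linear_schedule_with_warmup

# Approximate shape of the scheduler setup inside train_dataloader in
# examples/rag/finetune.py; self.opt has not been set at this point, which
# triggers the ModuleAttributeError above.
scheduler = get_linear_schedule_with_warmup(
    self.opt,  # <- 'GenerativeQAModule' object has no attribute 'opt'
    num_warmup_steps=self.hparams.warmup_steps,
    num_training_steps=t_total,
)
self.lr_scheduler = scheduler
```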
To reproduce
I have installed the transformers library from source, not from pip.