How to load a LoRA adapter? #372
Comments
Hi @momozzing, thanks for reporting this. I believe this is an issue with the current version of our library that should be solved by #378.
Hello! Unfortunately, I have the same problem. Is the bug resolved, or should we wait for some other checks?
@calpt Wow~ Thank you for solving this problem!
@Ch-rode The code has not been merged yet. With the fix applied, it works well!
The mentioned fix is now merged into the master branch, so this issue should be resolved when installing from the latest master branch version.
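As a sketch, installing directly from the master branch (assuming pip and the adapter-hub/adapter-transformers GitHub repository) would look like:

```shell
# Install the latest development version of adapter-transformers from master
pip install -U git+https://github.com/adapter-hub/adapter-transformers.git
```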
Thank you for the open-source adapters library!!
I'm using a LoRA adapter from adapter-hub,
but loading the adapter does not work...
Training setup:
Training finishes without errors.
Loading setup:
but load_adapter does not work...
How should I do it?
My error message:
Some weights of the model checkpoint at model_save/gpt2-20 were not used when initializing GPT2AdapterModel: ['transformer.h.2.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.11.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.6.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.10.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.3.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.0.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.2.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.5.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.6.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.9.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.8.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.1.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.4.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.11.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.4.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.0.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.5.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.9.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.10.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.8.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.7.attn.c_attn.loras.LoRA.lora_B', 'transformer.h.1.attn.c_attn.loras.LoRA.lora_A', 'lm_head.weight', 'transformer.h.3.attn.c_attn.loras.LoRA.lora_A', 'transformer.h.7.attn.c_attn.loras.LoRA.lora_A']
- This IS expected if you are initializing GPT2AdapterModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing GPT2AdapterModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).