Hi, when I'm training LLaMA with LoRA, I seem to get different results depending on whether I load the LoRA weights via get_peft_model or from_pretrained. I can't really tell why; both run inference successfully, but with vastly different results.
Could someone be so kind as to tell me which is the correct way of doing this? Many thanks!
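For context, the usual cause of this discrepancy is that get_peft_model attaches freshly initialized adapter weights to the base model, while PeftModel.from_pretrained restores the trained adapter from disk. A minimal sketch of the save/load round trip, with placeholder model paths and an assumed LoRA configuration (the exact target modules depend on your setup):

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, PeftModel

# --- Training time ---
base_model = AutoModelForCausalLM.from_pretrained("path/to/llama")  # placeholder path
lora_config = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"])

# get_peft_model creates NEW, randomly initialized LoRA weights on top of
# the base model -- it does not load anything from disk.
model = get_peft_model(base_model, lora_config)
# ... train ...
model.save_pretrained("path/to/lora_ckpt")  # saves only the adapter weights

# --- Inference time ---
base_model = AutoModelForCausalLM.from_pretrained("path/to/llama")
# from_pretrained loads the trained adapter weights saved above; this is
# the correct way to restore a finetuned LoRA checkpoint.
model = PeftModel.from_pretrained(base_model, "path/to/lora_ckpt")
model.eval()
```

Calling get_peft_model again at inference time would attach untrained adapters on top of the base model, which would explain why both paths run but produce very different outputs.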
Hi nuoma, could you please show me your code for saving and loading checkpoints? I'm fine-tuning LLaMA with LoRA and seem to encounter some issues when running inference with my checkpoint (saved with model.save_pretrained() and loaded with PeftModel.from_pretrained(base_model, lora_ckpt_path)). Thanks a lot!
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.