Keep getting "has no attribute '_get_submodules'" trying to apply model #856
Comments
Could you please tell us what you're trying to achieve with your code (after
That was the full stack trace. I'm just trying to merge a LoRA with a model and save it, as detailed in these lines.
Here's the entire log from bash to bash:
Is there a reason why you're not using
Sorry, is there an example posted on how to do this? All I'm trying to do is merge a model with a LoRA and save it.
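For reference, the usual pattern with recent PEFT releases looks roughly like the following sketch. The adapter path and output directory are placeholders (not from the thread); the base model ID is the one mentioned later in this issue:

```python
# Sketch: merge a LoRA adapter into its base model and save the result.
# Adapter path and output directory are hypothetical placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "TheBloke/Llama-2-7B-fp16"  # unquantized fp16 base weights
adapter_id = "path/to/lora-adapter"         # placeholder adapter location

base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16
)

# Wrap the base model with the LoRA adapter, then fold the adapter
# weights into the base weights and drop the PEFT wrapper modules.
model = PeftModel.from_pretrained(base_model, adapter_id)
merged_model = model.merge_and_unload()

merged_model.save_pretrained("merged-model")
AutoTokenizer.from_pretrained(base_model_id).save_pretrained("merged-model")
```

Note that `merge_and_unload()` is a method of the `PeftModel` wrapper, not of the underlying `LlamaForCausalLM`, so it must be called on the wrapped model.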
AttributeError: 'LlamaForCausalLM' object has no attribute 'merge'
That's probably because of what I mentioned:
You could either try again based on that branch or just wait a little bit more until we merge it in PEFT, which should hopefully be very soon.
I'm sorry, I don't see the relevance… neither the model nor the LoRA adapter is quantized. base_model = "TheBloke/Llama-2-7B-fp16". The model contains unquantized fp16 weights… how would a PR for loading quantized layers help?
Ah yes, sorry. Could you try
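The exact suggestion is cut off here, but since the thread mentions waiting for an unreleased PEFT change, it was plausibly to install PEFT from source. If so, that would look like:

```shell
# Assumption: the suggestion was to try the unreleased PEFT code from main.
pip install -U git+https://github.com/huggingface/peft.git
```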
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
System Info
peft version=0.6.0.dev0
accelerate version=0.22.0
transformers version=4.32.0
python version=3.8.10
platform (from neofetch; ASCII-art logo omitted):
Kernel: 5.15.0-1042-azure
Uptime: 33 mins
Packages: 699 (dpkg), 4 (snap)
Shell: bash 5.0.17
Resolution: 1024x768
CPU: AMD EPYC 7763 (8) @ 2.445GHz
Memory: 643MiB / 257691MiB
Who can help?
@pacman100 @younesbelkada
Information
Tasks
examples folder
Reproduction
here's the script
Expected behavior
The model downloads and loads into RAM (observed through htop). Then, a few minutes later, it fails with the `_get_submodules` AttributeError.