
Bug when passing subfolder parameter in PeftModel.load_adapter method #718

Closed
2 of 4 tasks
ShayDuane opened this issue Jul 17, 2023 · 1 comment · Fixed by #721

Comments

@ShayDuane

System Info

peft 0.4.0.dev0

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

Given a checkpoint folder as shown below.

drwxr-xr-x 2 root root     4096 Jul 16 09:54 LoraAdapter
-rw-r--r-- 1 root root 23443269 Jul 16 09:54 optimizer.pt
-rw-r--r-- 1 root root       27 Jul 16 09:54 README.md
-rw-r--r-- 1 root root    14575 Jul 16 09:54 rng_state.pth
-rw-r--r-- 1 root root      627 Jul 16 09:54 scheduler.pt
-rw-r--r-- 1 root root     7592 Jul 16 09:54 trainer_state.json
-rw-r--r-- 1 root root     3963 Jul 16 09:54 training_args.bin

LoraAdapter is the name I specified for the LoRA adapter; here it is a folder containing the adapter's weight and configuration files.

-rw-r--r-- 1 root root      411 Jul 16 09:54 adapter_config.json
-rw-r--r-- 1 root root 11719577 Jul 16 09:54 adapter_model.bin

I want to continue training from this checkpoint, so I need to instantiate a PeftModel and load the LoRA adapter weights into it. However, my adapter weight file is in a subfolder of the checkpoint, so I use the following approach.

from transformers import AutoModel
from peft import PeftModelForCausalLM, LoraConfig

# Load the adapter config from the 'LoraAdapter' subfolder inside the checkpoint
lora_config = LoraConfig.from_pretrained('ChatGlm2-6B_LoRA/QLora_Adapter_THUDM_chatglm2-6b/checkpoint-2000', subfolder='LoraAdapter')
base_model = AutoModel.from_pretrained('THUDM/chatglm2-6b', trust_remote_code=True, device_map='auto').half()
peftmodel = PeftModelForCausalLM(base_model, lora_config, adapter_name='LoraAdapter')

# Load the adapter weights, again pointing at the 'LoraAdapter' subfolder
peftmodel.load_adapter(model_id='ChatGlm2-6B_LoRA/QLora_Adapter_THUDM_chatglm2-6b/checkpoint-2000', adapter_name='LoraAdapter', subfolder='LoraAdapter')

But it cannot correctly locate my LoRA adapter weight file.

I think the problem is here.

peft/src/peft/peft_model.py

Lines 484 to 502 in 029f416

def _split_kwargs(cls, kwargs: Dict[str, Any]):
    _kwargs_not_in_hf_hub_download_signature = ("use_auth_token",)
    hf_hub_download_kwargs = {}
    other_kwargs = {}

    for key, value in kwargs.items():
        if key in inspect.signature(hf_hub_download).parameters or key in _kwargs_not_in_hf_hub_download_signature:
            hf_hub_download_kwargs[key] = value
        else:
            other_kwargs[key] = value

    return hf_hub_download_kwargs, other_kwargs

def load_adapter(self, model_id: str, adapter_name: str, is_trainable: bool = False, **kwargs: Any):
    from .mapping import PEFT_TYPE_TO_CONFIG_MAPPING

    hf_hub_download_kwargs, kwargs = self._split_kwargs(kwargs)

    if adapter_name not in self.peft_config:

Because 'subfolder' is a parameter of hf_hub_download, this method routes the 'subfolder' keyword argument into hf_hub_download_kwargs rather than leaving it in the remaining kwargs.
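This routing can be checked directly; the snippet below is my own verification (not part of peft) that 'subfolder' matches hf_hub_download's signature and is therefore captured by _split_kwargs:

import inspect
from huggingface_hub import hf_hub_download

# "subfolder" is a parameter of hf_hub_download, so _split_kwargs captures it
# in hf_hub_download_kwargs and it never reaches the remaining kwargs.
print("subfolder" in inspect.signature(hf_hub_download).parameters)  # True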

So here

peft/src/peft/peft_model.py

Lines 519 to 522 in 029f416

# load weights if any
path = os.path.join(model_id, kwargs["subfolder"]) if kwargs.get("subfolder", None) is not None else model_id
if os.path.exists(os.path.join(path, SAFETENSORS_WEIGHTS_NAME)):

By this point the 'subfolder' parameter is no longer present in kwargs, so kwargs.get("subfolder", None) always returns None and the adapter weights inside the subfolder are never found.
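For reference, one way the local path resolution could respect the argument (an illustrative sketch only; the actual change in #721 may differ) is to read 'subfolder' from hf_hub_download_kwargs, where _split_kwargs actually placed it:

# Illustrative sketch, not the actual patch in #721: take "subfolder" from
# hf_hub_download_kwargs, since _split_kwargs routed it there.
path = (
    os.path.join(model_id, hf_hub_download_kwargs["subfolder"])
    if hf_hub_download_kwargs.get("subfolder", None) is not None
    else model_id
)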

Expected behavior

load_adapter should respect the subfolder argument when resolving the local weights path, so that adapter weights stored in a subfolder of the checkpoint are found and loaded.
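In the meantime, a possible workaround (an untested sketch based on the path handling shown above, assuming the adapter files sit directly in the LoraAdapter folder) is to point model_id at the subfolder itself and drop the subfolder argument:

import os

checkpoint_dir = 'ChatGlm2-6B_LoRA/QLora_Adapter_THUDM_chatglm2-6b/checkpoint-2000'

# Point model_id directly at the adapter folder so load_adapter's
# os.path.join(path, <weights filename>) check finds adapter_model.bin without subfolder.
peftmodel.load_adapter(model_id=os.path.join(checkpoint_dir, 'LoraAdapter'), adapter_name='LoraAdapter')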

@younesbelkada
Contributor

Hi @RuSignalFlag
#721 should fix the issue, thanks a lot for noticing!
