
Loading LORA on GPTQ fails with latest update #2623

Closed
FartyPants opened this issue Jun 11, 2023 · 2 comments
Labels
bug Something isn't working

Comments


FartyPants commented Jun 11, 2023

Describe the bug

I found the culprit: two days ago, peft was bumped from
git+https://github.com/huggingface/peft@3714aa2fff158fdfa637b2b65952580801d890b2
to
git+https://github.com/huggingface/peft@e45529b149c7f91ec1d4d82a5a152ef56c56cb94

Moving back to git+https://github.com/huggingface/peft@3714aa2fff158fdfa637b2b65952580801d890b2 makes everything work again, because the old peft version did not forward the extra kwargs into PeftConfig.

With the new commit, loading ANY previously working LoRA on a 4-bit 128g GPTQ model fails.

Relevant traceback:

```text
File "H:\oobabooga_windows\text-generation-webui\modules\LoRA.py", line 80, in add_lora_to_model
    shared.model = PeftModel.from_pretrained(shared.model, Path(f"{shared.args.lora_dir}/{lora_names[0]}"), **params)
File "H:\oobabooga_windows\installer_files\env\lib\site-packages\peft\peft_model.py", line 169, in from_pretrained
    PeftConfig.from_pretrained(model_id, subfolder=kwargs.get("subfolder", None), **kwargs).peft_type
File "H:\oobabooga_windows\installer_files\env\lib\site-packages\peft\utils\config.py", line 114, in from_pretrained
    config = cls(**kwargs)
TypeError: PeftConfig.__init__() got an unexpected keyword argument 'dtype'
```

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

Loading TheBloke_Wizard-Vicuna-13B-Uncensored-GPTQ, then trying to load any LoRA that previously worked, always results in the same error.

Screenshot

No response

Logs

The code in LoRA.py adds params['dtype'] and params['device_map']:

```python
            if not shared.args.cpu:
                params['dtype'] = shared.model.dtype
                if hasattr(shared.model, "hf_device_map"):
                    params['device_map'] = {"base_model.model." + k: v for k, v in shared.model.hf_device_map.items()}
                elif shared.args.load_in_8bit:
                    params['device_map'] = {'': 0}
```

This will ultimately result in the unexpected keyword argument 'dtype' error, because the kwargs are now forwarded (unlike in the previous pull) down in peft's config.py:

```python
config = cls(**kwargs)
```
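The failure mode can be reproduced in isolation. Here is a minimal sketch of why forwarding the extra keys into a strict `__init__` raises this TypeError (`PeftConfigLike` is a hypothetical stand-in, not the real peft class):

```python
# PeftConfigLike is a hypothetical stand-in for peft's PeftConfig, whose
# __init__ only accepts a fixed set of keyword arguments.
class PeftConfigLike:
    def __init__(self, peft_type=None, task_type=None):
        self.peft_type = peft_type
        self.task_type = task_type

# 'dtype' is the extra key that LoRA.py adds to params.
params = {"peft_type": "LORA", "dtype": "float16"}

try:
    PeftConfigLike(**params)  # mirrors `config = cls(**kwargs)` in peft
except TypeError as e:
    error_message = str(e)

print(error_message)  # ... got an unexpected keyword argument 'dtype'
```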

System Info

Windows
@FartyPants

I submitted PR #2629
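Without claiming this is what the PR does, one common shape for this kind of fix is to filter the params dict down to the keyword arguments the target `__init__` actually accepts before forwarding it. All names below are hypothetical illustration, not actual webui or peft code:

```python
import inspect

def filter_config_kwargs(config_cls, params):
    """Keep only the keys that config_cls.__init__ accepts (hypothetical helper)."""
    accepted = set(inspect.signature(config_cls.__init__).parameters)
    accepted.discard("self")
    return {k: v for k, v in params.items() if k in accepted}

class StrictConfig:  # hypothetical stand-in for a strict PeftConfig-like class
    def __init__(self, peft_type=None):
        self.peft_type = peft_type

params = {"peft_type": "LORA", "dtype": "float16", "device_map": {"": 0}}
safe_params = filter_config_kwargs(StrictConfig, params)
print(safe_params)          # {'peft_type': 'LORA'}
StrictConfig(**safe_params)  # no TypeError now
```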


FartyPants commented Jun 11, 2023

I'm closing this, because the cause is known and it is not a bug in the webui.
