Issues: huggingface/peft
Bug: BOFT forward/merging with CUDA (label: bug)
#2219 opened Nov 18, 2024 by BenjaminBossan
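For context, this is the general shape of a BOFT fine-tune followed by a merge with the PEFT API. A minimal sketch only: the base checkpoint, target modules, and block size are illustrative assumptions, not values taken from the issue.

```python
import torch
from transformers import AutoModelForCausalLM
from peft import BOFTConfig, get_peft_model

# Illustrative base model; any causal LM with q_proj/v_proj modules works similarly.
base = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-350m", torch_dtype=torch.float16, device_map="cuda"
)

config = BOFTConfig(
    boft_block_size=4,                    # size of the orthogonal butterfly blocks (illustrative)
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
)
model = get_peft_model(base, config)

# ... training would happen here ...

# The merge step the issue is about: fold the BOFT weights back into the base model.
merged = model.merge_and_unload()
```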
KeyError: 'base_model.model.model.model.layers.14.mlp.down_proj' when merging and exporting the model on CUDA after training with QLoRA at rank 4
#2213 opened Nov 14, 2024 by xiaoheiyue
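As background for this kind of merge error, the commonly recommended path for merging a QLoRA adapter is to reload the base model without 4-bit quantization and then merge. A rough sketch; the model ID and adapter path are placeholders:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

# Reload the base model in half precision rather than 4-bit before merging;
# merging directly into a bitsandbytes-quantized model is error-prone.
base = AutoModelForCausalLM.from_pretrained(
    "base-model-id",          # placeholder
    torch_dtype=torch.float16,
    device_map="cuda",
)
model = PeftModel.from_pretrained(base, "path/to/qlora-adapter")  # placeholder path
merged = model.merge_and_unload()
merged.save_pretrained("merged-model")
```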
TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules'
#2208 opened Nov 9, 2024 by imrankh46
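For reference, exclude_modules is only accepted by sufficiently recent PEFT versions; on older releases LoraConfig raises exactly this TypeError. A hedged sketch of the intended usage, assuming an installed PEFT version that ships the parameter; the excluded module name is illustrative:

```python
from peft import LoraConfig

# Assumes a PEFT version that supports `exclude_modules`;
# older versions raise the TypeError from the issue title.
config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules="all-linear",
    exclude_modules=["lm_head"],  # illustrative module to leave untouched
)
```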
Incorrect modules_to_save overlap across multiple LoRA adapters
#2206 opened Nov 8, 2024 by saeid93
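For context, this is the usage pattern the issue concerns: two LoRA adapters on one model, each declaring its own modules_to_save copy of the classification head. A sketch with illustrative model, module, and adapter names:

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

cfg_a = LoraConfig(target_modules=["query", "value"], modules_to_save=["classifier"])
cfg_b = LoraConfig(target_modules=["query", "value"], modules_to_save=["classifier"])

model = get_peft_model(base, cfg_a, adapter_name="adapter_a")
model.add_adapter("adapter_b", cfg_b)

# Each adapter should hold its own copy of the `classifier` head;
# the report is that these copies overlap incorrectly.
model.set_adapter("adapter_a")
```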
RuntimeError: element 0 of tensors.. OpenCLIP model
#2200 opened Nov 5, 2024 by EngEmmanuel
How to change the 'modules_to_save' setting when reloading a LoRA-finetuned model
#2188 opened Oct 30, 2024 by dengchengxifrank
X-LoRA cannot reload the model from the last checkpoint via trainer.train(resume_from_checkpoint="checkpp")
#2185 opened Oct 29, 2024 by SongHanKen
Prompt Tuning crash with Llama-3.2 in torch.embedding
#2161 opened Oct 18, 2024 by hrsmanian
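For reference, a minimal prompt-tuning setup with PEFT; the Llama checkpoint name, virtual-token count, and init text are placeholder assumptions:

```python
from transformers import AutoModelForCausalLM
from peft import PromptTuningConfig, PromptTuningInit, get_peft_model

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.2-1B")  # placeholder checkpoint

config = PromptTuningConfig(
    task_type="CAUSAL_LM",
    num_virtual_tokens=20,
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Classify the sentiment of this text:",
    tokenizer_name_or_path="meta-llama/Llama-3.2-1B",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()
```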
LoraConfig conflict when using layers_to_transform in LlamaModel
#2155 opened Oct 17, 2024 by Evan02580
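For context, layers_to_transform restricts LoRA to specific decoder layers and is typically paired with layers_pattern. A sketch with illustrative values for a Llama-style model:

```python
from peft import LoraConfig

# Restrict LoRA to the first four decoder layers.
# `layers_pattern="layers"` matches the `model.layers.<i>` naming; values are illustrative.
config = LoraConfig(
    r=8,
    target_modules=["q_proj", "v_proj"],
    layers_to_transform=[0, 1, 2, 3],
    layers_pattern="layers",
)
```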
Key mismatch when trying to load a LoRA adapter into an X-LoRA model
#2132 opened Oct 5, 2024 by p4arth
merge_and_unload docs do not clarify behaviour for quantized base models
#2105 opened Sep 26, 2024 by RonanKMcGovern
Deprecation: Transformers will no longer support past_key_values to be tuples (labels: wip, contributions-welcome, help wanted)
#1962 opened Jul 26, 2024 by BenjaminBossan
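For background, newer Transformers versions replace the tuple-of-tuples KV cache with Cache objects; DynamicCache provides helpers to convert in both directions. A small sketch with a dummy legacy cache (tensor shapes are illustrative):

```python
import torch
from transformers import DynamicCache

# Dummy legacy cache: one layer of (key, value) tensors with shape
# (batch, num_heads, seq_len, head_dim).
legacy_past = ((torch.zeros(1, 8, 4, 64), torch.zeros(1, 8, 4, 64)),)

cache = DynamicCache.from_legacy_cache(legacy_past)  # tuples -> Cache object
legacy_again = cache.to_legacy_cache()               # Cache object -> tuples
```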
Inference with different LoRA adapters in the same batch does not use the correct modules_to_save classifier (labels: contributions-welcome, wip)
#1960 opened Jul 26, 2024 by saeid93
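For context, PEFT can route different rows of a single batch through different LoRA adapters via the adapter_names argument; the report is that modules_to_save heads are not honored on this path. A rough sketch; the model, adapter paths, and prompts are placeholders:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder base model and adapter paths; the calling pattern is what matters.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
base = AutoModelForCausalLM.from_pretrained("gpt2")

model = PeftModel.from_pretrained(base, "path/to/adapter_a", adapter_name="adapter_a")
model.load_adapter("path/to/adapter_b", adapter_name="adapter_b")

inputs = tokenizer(["first example", "second example"], return_tensors="pt", padding=True)
# One adapter name per row of the batch; "__base__" would select the unmodified base model.
outputs = model.generate(**inputs, adapter_names=["adapter_a", "adapter_b"], max_new_tokens=10)
```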
[Call for contributions] help us improve LoKr, LoHa, and other LyCORIS methods (labels: contributions-welcome, good-second-pr)
#1935 opened Jul 18, 2024 by sayakpaul
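For reference, LoKr and LoHa are configured through their own config classes in PEFT. A minimal sketch with an illustrative base model and hyperparameters:

```python
from transformers import AutoModelForCausalLM
from peft import LoHaConfig, LoKrConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")  # placeholder model

# LoHa (low-rank Hadamard product) adapter; values are illustrative.
loha = LoHaConfig(r=8, alpha=16, target_modules=["q_proj", "v_proj"])
model = get_peft_model(base, loha)

# LoKr (low-rank Kronecker product) is configured analogously:
lokr = LoKrConfig(r=8, alpha=16, target_modules=["q_proj", "v_proj"])
```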