Adapters v1.0.0
Blog post: https://adapterhub.ml/blog/2024/08/adapters-update-reft-qlora-merging-models
This version is built for Hugging Face Transformers v4.43.x.
New Adapter Methods & Model Support
- Add Representation Fine-Tuning (ReFT) implementation (LoReFT, NoReFT, DiReFT) (@calpt via #705)
- Add LoRA weight merging with Task Arithmetic (@lenglaender via #698)
- Add Whisper model support + notebook (@TimoImhof via #693; @julian-fong via #717)
- Add Mistral model support (@KorventennFR via #609)
- Add PLBart model support (@FahadEbrahim via #709)
Breaking Changes & Deprecations
- Remove support for loading from archived Hub repository (@calpt via #724)
- Remove deprecated add_fusion() & train_fusion() methods (@calpt via #714)
- Remove deprecated arguments in push_adapter_to_hub() method (@calpt via #724)
- Deprecate support for passing Python lists to adapter activation (@calpt via #714)