
Adapters 0.1.0

Released by @lenglaender on 24 Nov 10:10 · commit 4c00622

Blog post: https://adapterhub.ml/blog/2023/11/introducing-adapters/

The new Adapters library is a fundamental refactoring of the adapter-transformers library, adding support for new models and adapter methods.

This version is compatible with Hugging Face Transformers version 4.35.2.

For a guide on how to migrate from adapter-transformers to Adapters, see https://docs.adapterhub.ml/transitioning.md.
Changes below are given relative to the latest adapter-transformers release, v3.2.1.
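For orientation, the central migration step is attaching adapter support to a stock Transformers model at runtime, rather than importing patched model classes as in adapter-transformers. A minimal sketch, assuming the adapters.init entry point described in the documentation (model and adapter names are illustrative):

```python
from transformers import AutoModel

import adapters

# Load a plain Hugging Face model, then attach adapter support to it.
# adapter-transformers instead shipped patched model classes.
model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)

# The model now exposes the adapter API, e.g.:
model.add_adapter("my_adapter")
model.train_adapter("my_adapter")
```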

New Models & Adapter Methods

Breaking Changes

Changes Due to the Refactoring

  • Refactored the implementation of all previously supported models (@calpt, @lenglaender, @hSterz, @TimoImhof)
  • Separated the model config (PretrainedConfig) from the adapter config (ModelAdaptersConfig) (@calpt)
  • Updated the entire documentation, Jupyter notebooks, and example scripts (@hSterz, @lenglaender, @TimoImhof, @calpt)
  • Introduced the load_model function to load models containing adapters; it replaces the Hugging Face from_pretrained function used in the adapter-transformers library (see the sketch after this list) (@lenglaender)
  • Shared more logic for adapter composition between different composition blocks (@calpt via #591)
  • Added backwards compatibility tests that check whether changes to the codebase, such as refactorings, impair the functionality of the library (@TimoImhof via #596)
  • Refactored the EncoderDecoderModel by introducing a new mixin (ModelUsingSubmodelsAdaptersMixin) for models that contain other models (@lenglaender)
  • Renamed the class AdapterConfigBase to AdapterConfig (@hSterz via #603)
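To illustrate the load_model replacement and the AdapterConfig rename, here is a minimal sketch; the checkpoint path, model class, and exact signatures are assumptions based on the documentation, not a definitive recipe:

```python
from transformers import BertModel

import adapters
from adapters import AdapterConfig

# load_model replaces from_pretrained for checkpoints that were saved
# together with their adapters; it returns a model with adapter support.
model = adapters.load_model("path/to/saved/model", BertModel)

# The renamed AdapterConfig (formerly AdapterConfigBase) still resolves
# predefined adapter configurations by identifier:
config = AdapterConfig.load("seq_bn")
model.add_adapter("my_adapter", config=config)
```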

Fixes and Minor Improvements

  • Fixed the EncoderDecoderModel generate function (@lenglaender)
  • Fixed the deletion of invertible adapters (@TimoImhof)
  • Automatically convert heads when loading with an XAdapterModel class (@calpt via #594)
  • Fixed training T5 adapter models with the Trainer (@calpt via #599)
  • Ensured output embeddings are frozen during adapter training (@calpt via #537)