
Update readme & transitioning
calpt committed Apr 20, 2024
1 parent 51ba69a commit a15ac45
Showing 3 changed files with 29 additions and 11 deletions.
15 changes: 11 additions & 4 deletions README.md
@@ -14,10 +14,8 @@ See the License for the specific language governing permissions and
limitations under the License.
-->

-> **Note**: This repository holds the codebase of the _Adapters_ library, which has replaced `adapter-transformers`. For the legacy codebase, go to: https://github.com/adapter-hub/adapter-transformers-legacy.
<p align="center">
-<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapters/main/docs/logo.png" />
+<img style="vertical-align:middle" src="https://raw.githubusercontent.com/Adapter-Hub/adapters/main/docs/img/adapter-bert.png" width="80" />
</p>
<h1 align="center">
<span><i>Adapters</i></span>
@@ -26,12 +24,21 @@ limitations under the License.
<h3 align="center">
A Unified Library for Parameter-Efficient and Modular Transfer Learning
</h3>
<h3 align="center">
<a href="https://adapterhub.ml">Website</a>
&nbsp; • &nbsp;
<a href="https://docs.adapterhub.ml">Documentation</a>
&nbsp; • &nbsp;
<a href="https://arxiv.org/abs/2311.11077">Paper</a>
</h3>

![Tests](https://github.com/Adapter-Hub/adapters/workflows/Tests/badge.svg?branch=adapters)
[![GitHub](https://img.shields.io/github/license/adapter-hub/adapters.svg?color=blue)](https://github.com/adapter-hub/adapters/blob/main/LICENSE)
[![PyPI](https://img.shields.io/pypi/v/adapters)](https://pypi.org/project/adapters/)

-`adapters` is an add-on to [HuggingFace's Transformers](https://github.com/huggingface/transformers) library, integrating adapters into state-of-the-art language models by incorporating **[AdapterHub](https://adapterhub.ml)**, a central repository for pre-trained adapter modules.
+_Adapters_ is an add-on library to [HuggingFace's Transformers](https://github.com/huggingface/transformers), integrating [various adapter methods](https://docs.adapterhub.ml/overview.html) into [state-of-the-art pre-trained language models](https://docs.adapterhub.ml/model_overview.html) with minimal coding overhead for training and inference.

+> **Note**: The _Adapters_ library has replaced the `adapter-transformers` package. All previously trained adapters are compatible with the new library. For transitioning, please read: https://docs.adapterhub.ml/transitioning.html.
## Installation

Binary file added docs/img/adapter-bert.png
25 changes: 18 additions & 7 deletions docs/transitioning.md
@@ -1,5 +1,9 @@
-# Transitioning from `adapter_transformers`
+# Transitioning from `adapter-transformers`

```{eval-rst}
.. important::
   ``adapters`` is fully compatible with ``adapter-transformers`` in terms of model weights, meaning you can load any adapter trained with any version of ``adapter-transformers`` into the new library without degradation.
```

The new `adapters` library is the successor to the `adapter-transformers` library. The key difference is that `adapters` is now a stand-alone package: it is disentangled from Hugging Face's `transformers` package and is no longer a drop-in replacement for it.

@@ -9,17 +13,17 @@ This results in some breaking changes. To transition your code from `adapter-tra
To use the library, you need to install
`transformers` and `adapters` in the same environment (unlike `adapter-transformers`, which bundled `transformers` and could not be installed alongside it).

-Run the following to install both (installing `adapters` will automatically trigger the installation of `transformers` if it is not yet installed in th environment):
+Run the following to install both (installing `adapters` will automatically trigger the installation of a compatible `transformers` version):

```bash
pip install adapters
```

This also changes the namespace to `adapters`. For all imports of adapter classes, change the import from `transformers` to `adapters`.
This mainly affects the following classes:
-- AdapterModel classes, e.g. `AutoAdapterModel`(see [AdapterModels](https://docs.adapterhub.ml/model_overview.html) )
+- AdapterModel classes, e.g. `AutoAdapterModel` (see [AdapterModels](https://docs.adapterhub.ml/model_overview.html))
- Adapter configurations, e.g. `PrefixTuningConfig` (see [Configurations](https://docs.adapterhub.ml/overview.html))
-- Adapter composition blocks, e.g. `Stack`(see [Composition Blocks](https://docs.adapterhub.ml/adapter_composition.html) )
+- Adapter composition blocks, e.g. `Stack` (see [Composition Blocks](https://docs.adapterhub.ml/adapter_composition.html))
- The `AdapterTrainer` class
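To illustrate the namespace change, here is a small hypothetical helper (not part of either library; the class names mirror the examples above) that rewrites a legacy single-line import to the new namespace:

```python
# Hypothetical helper (not part of adapter-transformers or adapters): rewrites
# a legacy single-line import of adapter-specific classes to the new namespace.
# Composition blocks such as `Stack` may also live under `adapters.composition`.
ADAPTER_CLASSES = {"AutoAdapterModel", "PrefixTuningConfig", "Stack", "AdapterTrainer"}

def migrate_import(line: str) -> str:
    """Return `line` with adapter-class imports pointed at `adapters`."""
    prefix = "from transformers import "
    if not line.startswith(prefix):
        return line  # not a `from transformers import ...` line; leave untouched
    names = [name.strip() for name in line[len(prefix):].split(",")]
    if names and all(name in ADAPTER_CLASSES for name in names):
        return "from adapters import " + ", ".join(names)
    return line  # mixed imports need a manual split across both packages

print(migrate_import("from transformers import AutoAdapterModel, AdapterTrainer"))
```

Imports that mix adapter classes with regular `transformers` classes have to be split into two import statements by hand.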

## Model Initialisation
@@ -48,9 +52,9 @@ The `adapters` library supports the configuration of adapters using [config stri


For a complete list of config strings and classes see [here](https://docs.adapterhub.ml/overview.html). We strongly recommend using the new config strings, but we will continue to support the old config strings for the time being to make the transition easier.
-Note that with the config strings the coresponding adapter config classes have changed, e.g. `PfeifferConfig` -> `SeqBnConfig`.
+Note that with the new config strings, the corresponding adapter config classes have changed, e.g. `PfeifferConfig` -> `SeqBnConfig`.
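For orientation, the rename can be sketched as a lookup table. Only the `PfeifferConfig` -> `SeqBnConfig` rename is stated here; the remaining entries follow the library's documented renames and should be double-checked against the current docs:

```python
# Partial rename table for adapter config classes. Only PfeifferConfig ->
# SeqBnConfig is stated in the surrounding text; the other entries are drawn
# from the library's documentation and should be verified against it.
RENAMED_CONFIGS = {
    "PfeifferConfig": "SeqBnConfig",
    "PfeifferInvConfig": "SeqBnInvConfig",
    "HoulsbyConfig": "DoubleSeqBnConfig",
    "HoulsbyInvConfig": "DoubleSeqBnInvConfig",
    "ParallelConfig": "ParBnConfig",
}

def new_config_name(old: str) -> str:
    # Classes that were not renamed (e.g. `LoRAConfig`) map to themselves.
    return RENAMED_CONFIGS.get(old, old)

print(new_config_name("PfeifferConfig"))  # SeqBnConfig
```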

-Another consequence of this that the `AdapterConfig` class is now not only for the bottleneck adapters anymore, but the base class of all the configurations (previously `AdapterConfigBase`). Hence the function this class serves has changed. However, you can still load adapter configs with:
+Another consequence of this is that the `AdapterConfig` class is no longer only for bottleneck adapters; it is now the base class of all adapter configurations (previously `AdapterConfigBase`). Hence, the function this class serves has changed. However, you can still load adapter configs with:
```python
adapter_config = AdapterConfig.load("lora")
```
@@ -65,7 +69,8 @@ Compared to `adapter-transformers`, there are a few features that are no longer 

## What has remained the same

-The functionality for adding, activating, and training adapters has __not__ changed, except for the renaming of some adapter configs. You still add and activate adapters as follows:
+- The new library is fully backwards compatible in terms of adapter weights, i.e. you can load all adapter modules trained with `adapter-transformers`.
+- The functionality for adding, activating, and training adapters has __not__ changed, except for the renaming of some adapter configs. You still add and activate adapters as follows:
```python
# add adapter to the model
model.add_adapter("adapter_name", config="lora")
# activate adapter
model.set_active_adapters("adapter_name")
# train adapter
model.train_adapter("adapter_name")
```

## Where can I still find `adapter-transformers`?

The codebase of `adapter-transformers` has moved to [https://github.com/adapter-hub/adapter-transformers-legacy](https://github.com/adapter-hub/adapter-transformers-legacy) for archival purposes.

The full documentation of the old library is now hosted at [https://docs-legacy.adapterhub.ml](https://docs-legacy.adapterhub.ml/).
