Update Embedding Documentation (#706)
Improve Embedding Documentation (#643) by:
- Adding an example notebook
- Updating the embedding page in the documentation

---------

Co-authored-by: calpt <calpt@mail.de>
Co-authored-by: Timo Imhof <timo.imhof.uni@gmail.com>
3 people authored Jun 24, 2024
1 parent c243478 commit 3c2e702
Showing 2 changed files with 793 additions and 3 deletions.
12 changes: 9 additions & 3 deletions docs/embeddings.md
@@ -1,7 +1,7 @@
# Embeddings

With `adapters`, we support dynamically adding, loading, and deleting `Embeddings`. This section
will give you an overview of these features. A toy example is illustrated in this [notebook](https://colab.research.google.com/github/Adapter-Hub/adapters/blob/main/notebooks/Adapter_With_Embeddings.ipynb).

## Adding and Deleting Embeddings
The methods for handling embeddings are similar to the ones for handling adapters. To add new embeddings, we call `add_embeddings`.
@@ -12,13 +12,12 @@ is currently active, the `active_embeddings` property contains the currently active embedding.

```python
model.add_embeddings('name', tokenizer, reference_embedding='default', reference_tokenizer=reference_tokenizer)
embedding_name = model.active_embeddings
```
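
For context, here is a minimal end-to-end sketch of adding an embedding. The checkpoint name, the domain tokenizer path, and the embedding name `"custom"` are illustrative assumptions, not part of the library:

```python
import adapters
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # enable the adapters methods on a vanilla transformers model

reference_tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# Hypothetical tokenizer with a custom, e.g. domain-specific, vocabulary.
tokenizer = AutoTokenizer.from_pretrained("./my-domain-tokenizer")

# Create a new embedding matrix sized to `tokenizer`'s vocabulary; tokens that
# also exist in the reference tokenizer copy their vectors from "default".
model.add_embeddings(
    "custom",
    tokenizer,
    reference_embedding="default",
    reference_tokenizer=reference_tokenizer,
)
print(model.active_embeddings)  # the new embedding is set as active: "custom"
```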

The original embedding of the transformers model is always available under the name `"default"`. To set it as the active
embedding, simply call the `set_active_embeddings` method:
```python
model.set_active_embeddings("default")
model.set_active_embeddings('name')
```
Similarly, any other embedding can be set as active by passing its name to the `set_active_embeddings` method.
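
For example, switching between the added embedding and the original one (reusing the hypothetical `"custom"` embedding from the sketch above):

```python
model.set_active_embeddings("custom")   # activate the added embedding
print(model.active_embeddings)          # -> "custom"
model.set_active_embeddings("default")  # switch back to the original embedding
```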

@@ -29,6 +28,13 @@
```python
model.delete_embeddings('name')
```
Please note that if the active embedding is deleted, the default embedding is set as the active embedding.
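
A short sketch of that fallback behavior, again assuming a previously added embedding named `"custom"`:

```python
model.set_active_embeddings("custom")
model.delete_embeddings("custom")  # deletes the currently active embedding
print(model.active_embeddings)     # falls back to "default"
```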

## Training Embeddings
Embeddings can only be trained together with an adapter. To freeze all weights except for the embeddings and the adapter:
```python
model.train_adapter('adapter_name', train_embeddings=True)
```
Except for the `train_embeddings` flag, the training is the same as for just training an adapter (see [Adapter Training](training.md)).
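
Putting it together, a hedged sketch of such a training run with the `AdapterTrainer`; the adapter name, dataset, and hyperparameters are placeholders:

```python
from adapters import AdapterTrainer
from transformers import TrainingArguments

model.add_adapter("adapter_name")  # a bottleneck adapter with the default config
model.train_adapter("adapter_name", train_embeddings=True)

# At this point only the adapter and embedding weights require gradients.
training_args = TrainingArguments(output_dir="./out", learning_rate=1e-4, num_train_epochs=3)
trainer = AdapterTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,  # placeholder: any tokenized dataset
)
trainer.train()
```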

## Saving and Loading Embeddings
You can save the embeddings by calling `save_embeddings('path/to/dir', 'name')` and load them with `load_embeddings('path/to/dir', 'name')`.
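
A short sketch of a save-and-reload round trip; the directory path and embedding name are placeholders, and it is an assumption here that a tokenizer passed to `save_embeddings` is stored alongside the embedding and returned again by `load_embeddings`:

```python
# Save the embedding under "custom" (assumption: the tokenizer can be
# passed as an extra argument so it is stored with the embedding).
model.save_embeddings("./embeddings/custom", "custom", tokenizer)

# Reload it under the same name and activate it; the stored tokenizer,
# if any, is returned (assumption, see lead-in).
tokenizer = model.load_embeddings("./embeddings/custom", "custom")
model.set_active_embeddings("custom")
```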
