v2.2.0
Based on transformers v4.11.3
New
Model support
- T5 adapter implementation (@AmirAktify & @hSterz via #182)
- EncoderDecoderModel adapter implementation (@calpt via #222)
Prediction heads
- AutoModelWithHeads prediction heads for language modeling (@calpt via #210)
- AutoModelWithHeads prediction head & training example for dependency parsing (@calpt via #208)
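A minimal sketch of attaching one of the new heads (assumptions: the adapter-transformers v2.2.0 package, which installs as `transformers`, is available; the head name and the usage lines are placeholders; imports are deferred so the sketch stands alone):

```python
# Sketch only: adding a language-modeling prediction head to a model with
# flexible heads. The head name "lm_head" is a placeholder.
def add_language_modeling_head(model, head_name="lm_head"):
    # Adds a causal language-modeling head under the given name and
    # makes it the active head for subsequent forward passes.
    model.add_causal_lm_head(head_name)
    model.active_head = head_name
    return model

# Usage (not run here; model checkpoint is an example):
# from transformers import AutoModelWithHeads
# model = AutoModelWithHeads.from_pretrained("gpt2")
# model = add_language_modeling_head(model)
```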
Training
- Add a new AdapterTrainer for training adapters (@hSterz via #218, #241)
- Enable training of Parallel block (@hSterz via #226)
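The new trainer can be sketched as follows (assumptions: adapter-transformers v2.2.0 installed as `transformers`; the adapter name, dataset arguments, and training settings are placeholders; imports are deferred so the sketch stands alone):

```python
# Sketch only: training a freshly added adapter with the new AdapterTrainer.
def train_new_adapter(model, train_dataset, eval_dataset, output_dir="./out"):
    # Deferred imports so the sketch parses without the library installed.
    from transformers import AdapterTrainer, TrainingArguments

    model.add_adapter("my_adapter")    # add a fresh adapter (name is a placeholder)
    model.train_adapter("my_adapter")  # freeze the base model, train only the adapter

    args = TrainingArguments(output_dir=output_dir, num_train_epochs=3)
    trainer = AdapterTrainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
    )
    trainer.train()
    return trainer
```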
Misc
- Add get_adapter_info() method (@calpt via #220)
- Add set_active argument to add & load adapter/fusion/head methods (@calpt via #214)
- Minor improvements for adapter card creation for HF Hub upload (@calpt via #225)
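Two of the additions above can be sketched together (assumptions: adapter-transformers v2.2.0; the adapter names are placeholders, and the `transformers.adapters.utils` import path for `get_adapter_info()` is an assumption; imports are deferred so the sketch stands alone):

```python
# Sketch only: the new set_active argument and the get_adapter_info() helper.
def add_and_activate(model):
    # With set_active=True the adapter is activated immediately on add,
    # replacing a separate model.set_active_adapters(...) call.
    model.add_adapter("my_adapter", set_active=True)
    return model

def lookup_adapter(adapter_id):
    # Deferred import; the exact module path is an assumption.
    from transformers.adapters.utils import get_adapter_info
    # Queries metadata for a pre-trained adapter by its identifier.
    return get_adapter_info(adapter_id)
```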
Changed
- Upgrade of underlying transformers version (@calpt via #232, #234, #239)
- Allow multiple AdapterFusion configs per model; remove set_adapter_fusion_config() (@calpt via #216)