Environment info
- `transformers` version: 4.35.2
- Platform: Linux-5.15.0-73-generic-x86_64-with-glibc2.35
- Python version: 3.10.13
- Huggingface_hub version: 0.20.1
- Safetensors version: 0.4.1
- Accelerate version: 0.25.0
- Accelerate config:
  - compute_environment: LOCAL_MACHINE
  - distributed_type: DEEPSPEED
  - use_cpu: False
  - debug: True
  - num_processes: 4
  - machine_rank: 0
  - num_machines: 1
  - rdzv_backend: static
  - same_network: True
  - main_training_function: main
  - deepspeed_config: {'deepspeed_config_file': 'ds_config/ds_config-llama-7b.json', 'zero3_init_flag': True}
  - downcast_bf16: no
  - tpu_use_cluster: False
  - tpu_use_sudo: False
  - tpu_env: []
  - dynamo_config: {'dynamo_backend': 'INDUCTOR', 'dynamo_mode': 'max-autotune', 'dynamo_use_dynamic': True, 'dynamo_use_fullgraph': True}
- PyTorch version (GPU?): 2.1.1+cu121 (False)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
- `adapters` version: 0.1.0
- Platform:
- Python version: 3.10.13
- PyTorch version (GPU?):

Details

Hello, I ran into this error when I migrated from the `adapter-transformers` package to the `adapters` package:
```
Traceback (most recent call last):
  File "/home/farid/adapters-lid/fusionbert_exp/adapterfusion-languagefusion.py", line 250, in <module>
    model.add_adapter_fusion(Fuse(*fused), config=fusion_config)
  File "/home/mambaforge/envs/venvadapt/lib/python3.10/site-packages/adapters/model_mixin.py", line 652, in add_adapter_fusion
    self.apply_to_adapter_layers(lambda i, layer: layer.add_fusion_layer(adapter_names))
  File "/home/mambaforge/envs/venvadapt/lib/python3.10/site-packages/adapters/model_mixin.py", line 443, in apply_to_adapter_layers
    fn(i, module)
  File "/home/mambaforge/envs/venvadapt/lib/python3.10/site-packages/adapters/model_mixin.py", line 652, in <lambda>
    self.apply_to_adapter_layers(lambda i, layer: layer.add_fusion_layer(adapter_names))
  File "/home/mambaforge/envs/venvadapt/lib/python3.10/site-packages/adapters/methods/bottleneck.py", line 126, in add_fusion_layer
    dropout_prob = fusion_config.dropout_prob or getattr(self.model_config, "attention_probs_dropout_prob", 0)
AttributeError: 'NoneType' object has no attribute 'dropout_prob'
```

Is this a bug, or am I making a mistake in my implementation? Thank you for your help!
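For context, the failing line in `bottleneck.py` dereferences `fusion_config` before the `or` fallback can apply, so a `None` config crashes immediately. A None-tolerant variant of that line might look like the sketch below (an illustration of the failure mode, not the library's actual code or fix):

```python
# Sketch only: a None-tolerant rewrite of the line that raises above.
# `fusion_config` arriving here as None is the immediate cause of the error,
# which suggests the fusion config was never resolved or registered upstream.
if fusion_config is not None and fusion_config.dropout_prob is not None:
    dropout_prob = fusion_config.dropout_prob
else:
    # Fall back to the backbone's attention dropout, defaulting to 0.
    dropout_prob = getattr(self.model_config, "attention_probs_dropout_prob", 0)
```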
Hey @faridlazuarda, at first glance the code you shared looks correct. I've tried to reproduce the issue using your snippet with some example models/adapters, but wasn't able to do so. Could you possibly share a minimal reproducible example that throws this error to help us investigate? Thanks!
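Since the original snippet isn't shown above, a minimal reproduction attempt might look like the following sketch; the model name, adapter names, and fusion config are placeholders, not the reporter's actual code:

```python
# Hypothetical reconstruction of the failing call pattern from the traceback.
import adapters
from adapters.composition import Fuse
from transformers import AutoModel

model = AutoModel.from_pretrained("bert-base-uncased")
adapters.init(model)  # required in `adapters`; implicit in adapter-transformers

# Register the adapters that will later be fused.
fused = ["adapter_a", "adapter_b"]
for name in fused:
    model.add_adapter(name)

# "dynamic" names a built-in AdapterFusionConfig; per the traceback, the
# config reaching the bottleneck layers was None instead of a resolved object.
fusion_config = "dynamic"
model.add_adapter_fusion(Fuse(*fused), config=fusion_config)
```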
This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.