System Info
peft==0.6.2
accelerate==0.24.1
transformers==4.33.2
platform: linux
python==3.9.0
Using QLoRA to fine-tune BaiChuan-13B raises AttributeError: 'Parameter' object has no attribute 'weight'.
The error is raised in /peft/tuners/adalora/bnb.py, line 144.
After looking through the source code, I found that lora_A[active_adapter] is treated as an nn.Module in peft==0.6.2, since the code accesses its weight attribute. But after the model is loaded, lora_A[active_adapter] is actually an nn.Parameter; it already is the weight of the corresponding module. I would like to know why this is the case?
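To make the mismatch concrete, here is a minimal sketch of the two storage styles (the container layout below is an assumption used for illustration, not the actual PEFT code):

```python
import torch
import torch.nn as nn

# LoRA-style storage (assumed): each adapter entry is an nn.Linear module,
# so accessing .weight on the entry works.
lora_style = nn.ModuleDict({"default": nn.Linear(16, 4, bias=False)})
print(lora_style["default"].weight.shape)  # torch.Size([4, 16])

# AdaLoRA-style storage (assumed): each adapter entry is an nn.Parameter,
# i.e. the entry already *is* the weight tensor.
adalora_style = nn.ParameterDict({"default": nn.Parameter(torch.zeros(4, 16))})
print(adalora_style["default"].shape)  # torch.Size([4, 16])

# Reproduces the reported error: a Parameter has no .weight attribute.
try:
    _ = adalora_style["default"].weight
except AttributeError as e:
    print(e)  # 'Parameter' object has no attribute 'weight'
```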
Who can help?
No response
Information
The official example scripts
My own modified scripts
Tasks
An officially supported task in the examples folder
My own task or dataset (give details below)
Reproduction
The error is raised in /peft/tuners/adalora/bnb.py, line 144.
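A hypothetical minimal script along these lines should hit the failing code path (the model ID, target_modules choice, and forward-pass step are assumptions, not taken from the original report):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import AdaLoraConfig, get_peft_model

model_id = "baichuan-inc/Baichuan-13B-Base"  # placeholder model ID

# QLoRA-style 4-bit quantization of the base model.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)

# Wrap the quantized model with an AdaLoRA adapter; "W_pack" is assumed to be
# the attention projection name in the BaiChuan architecture.
peft_config = AdaLoraConfig(target_modules=["W_pack"], task_type="CAUSAL_LM")
model = get_peft_model(model, peft_config)

# A single forward pass goes through peft/tuners/adalora/bnb.py and, on
# peft==0.6.2, raises AttributeError: 'Parameter' object has no attribute 'weight'.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
model(**inputs)
```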
Expected behavior
The adapter parameters should be accessed correctly, without raising an AttributeError.
Thanks for the report. It looks like you're using AdaLoRA, not LoRA. For AdaLoRA, there was indeed a bug that was fixed in #1146. If you try using PEFT from the main branch, it should work.
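For reference, a development install of PEFT from the main branch can be done with pip install git+https://github.com/huggingface/peft.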