
Using QLoRA to fine-tune BaiChuan-13B reports AttributeError: 'Parameter' object has no attribute 'weight' #1180

Aitejiu opened this issue Nov 24, 2023 · 2 comments

Comments

@Aitejiu commented Nov 24, 2023

System Info

peft==0.6.2
accelerate==0.24.1
transformers==4.33.2
platform:linux
python==3.9.0
Using QLoRA to fine-tune BaiChuan-13B reports AttributeError: 'Parameter' object has no attribute 'weight'.
The error is raised from /peft/tuners/adalora/bnb.py, line 144 (the original report included a screenshot of the failing code).
After looking through the source code, I found that peft==0.6.2 treats lora_A[active_adapter] as an nn.Module and accesses its weight attribute. However, once the model is loaded, lora_A[active_adapter] is actually an nn.Parameter; it already is the weight of the corresponding module. I would like to know why this is the case.
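For illustration, here is a minimal standalone sketch of the mismatch (not the actual PEFT source; the container layout is only an assumption based on reading the 0.6.2 code): an adapter entry stored as an nn.Linear module exposes .weight, while an entry stored as a bare nn.Parameter does not, which produces exactly this AttributeError.

```python
import torch
import torch.nn as nn

# LoRA-style container: each adapter entry is an nn.Linear module, so ".weight" exists.
lora_A = nn.ModuleDict({"default": nn.Linear(16, 8, bias=False)})
print(lora_A["default"].weight.shape)  # torch.Size([8, 16])

# AdaLoRA-style container: each adapter entry is a bare nn.Parameter,
# i.e. the tensor itself *is* the weight, so ".weight" does not exist.
adalora_A = nn.ParameterDict({"default": nn.Parameter(torch.zeros(8, 16))})
print(adalora_A["default"].shape)  # torch.Size([8, 16])

try:
    adalora_A["default"].weight
except AttributeError as err:
    print(err)  # 'Parameter' object has no attribute 'weight'
```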

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder
  • My own task or dataset (give details below)

Reproduction

The error is raised from /peft/tuners/adalora/bnb.py, line 144 (see the screenshot referenced in the description above).
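A rough reproduction sketch, assuming the failure is hit when a 4-bit (QLoRA-style) model wrapped with an AdaLoRA adapter runs a forward pass; the model name, target_modules, and hyperparameters below are placeholders, not the exact values from the original run:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import AdaLoraConfig, TaskType, get_peft_model, prepare_model_for_kbit_training

model_name = "baichuan-inc/Baichuan-13B-Chat"  # placeholder; any 4-bit-loaded causal LM should do

# QLoRA-style 4-bit quantization via bitsandbytes.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_type="nf4",
)

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    trust_remote_code=True,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# AdaLoRA (not plain LoRA) is what hits the bug in peft==0.6.2.
adalora_config = AdaLoraConfig(
    task_type=TaskType.CAUSAL_LM,
    target_modules=["W_pack"],  # placeholder; adjust to the model's projection module names
    init_r=12,
    target_r=4,
    lora_alpha=32,
    lora_dropout=0.05,
)
model = get_peft_model(model, adalora_config)

# The AttributeError surfaces once the quantized AdaLoRA layers run a forward pass.
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
model(**inputs)
```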

Expected behavior

The adapter weights should be resolved correctly instead of raising an AttributeError.

@BenjaminBossan (Member) commented
Thanks for the report. It looks like you're using AdaLoRA, not LoRA. For AdaLoRA, there was indeed a bug that was fixed in #1146. If you try using PEFT from the main branch, it should work.
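(For reference, "PEFT from the main branch" typically means installing from source, e.g. `pip install git+https://github.com/huggingface/peft`, until the fix lands in a released version.)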

@Aitejiu (Author) commented Nov 25, 2023

> Thanks for the report. It looks like you're using AdaLoRA, not LoRA. For AdaLoRA, there was indeed a bug that was fixed in #1146. If you try using PEFT from the main branch, it should work.

Thanks for the answer, my problem is solved!

@Aitejiu Aitejiu closed this as completed Nov 25, 2023