
FIX Prompt learning with latest transformers error #2140

Conversation

BenjaminBossan (Member) commented Oct 9, 2024

The error in PEFT is occurring after this transformers change:

huggingface/transformers#33870

After that change, model_kwargs no longer necessarily contains past_key_values in some of our tests, which resulted in a KeyError. We now account for this possibility. The affected models were opt and gpt2.

Example of failing CI: https://github.com/huggingface/peft/actions/runs/11256453704/job/31298371856

Note that CI won't detect this issue as it requires the latest transformers install. I ran this locally using pytest tests/test_decoder_models.py -k "(prefix or prompt) and (opt or gpt2)" -v.
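A minimal sketch of the kind of fix described above (hypothetical helper name, not the actual PEFT code): accessing past_key_values with dict.get rather than direct indexing, so a missing key yields None instead of raising a KeyError.

```python
def get_past_key_values(model_kwargs: dict):
    """Return past_key_values from model_kwargs, or None if absent.

    With newer transformers versions, model_kwargs may omit
    past_key_values entirely, so direct indexing with
    model_kwargs["past_key_values"] would raise a KeyError.
    """
    return model_kwargs.get("past_key_values")


# Older transformers: key is present, possibly set to a cache object.
cache = get_past_key_values({"past_key_values": "dummy_cache"})

# Newer transformers: key may be missing entirely; we get None
# instead of a KeyError.
no_cache = get_past_key_values({})
```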

HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

gante (Member) commented Oct 9, 2024

Makes sense 👍 (and apologies for breaking things here)

BenjaminBossan merged commit 1eab9bd into huggingface:main Oct 9, 2024
14 checks passed
BenjaminBossan deleted the fix-transformers-update-prompt-learning-missing-past_key_values branch October 9, 2024 15:21
3 participants