TypeError when setting the adapter_config argument for run_qa.py #564
Labels: bug
Environment info
adapter-transformers version: 3.2.1

Information
Model I am using (Bert, XLNet ...): Roberta
Language I am using the model on (English, Chinese ...): English
Adapter setup I am using (if any): I tried using the pfeiffer+inv and pfeiffer configs (as seen in Case A & B below).

The problem arises when using:
The task I am working on is:
To reproduce
Steps to reproduce the behavior:
Case A
I run run_qa.py like so:

Running the above yields the following error:
TypeError: transformers.adapters.configuration.AdapterConfigBase.load() argument after ** must be a mapping, not str
, as can be seen in:
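As far as I can tell, the "argument after **" part of that message means a plain string is being unpacked as keyword arguments somewhere on the way into AdapterConfigBase.load. A minimal Python sketch (not the library's actual code) that reproduces the same kind of TypeError:

```python
# Minimal sketch, NOT adapter-transformers code: unpacking a string with **
# raises exactly the kind of TypeError reported above.
def load(config=None, **kwargs):
    return {"config": config, **kwargs}

load(**{"reduction_factor": 16})   # fine: ** requires a mapping
load(**"pfeiffer+inv")             # TypeError: load() argument after ** must be a mapping, not str
```

So presumably the string I pass via --adapter_config ends up on the ** side of such a call instead of being treated as the config name.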
Case B
In my attempts to fix the problem, I referred to this to set the --adapter_config argument to the path of a JSON file (called pfeiffer.json) with the contents found at this configuration page. So my pfeiffer.json looks like this:
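Roughly, the file holds the Pfeiffer bottleneck settings from that page; a small sketch of how such a file can be produced (field names and defaults assumed from the PfeifferConfig documentation, so they may not match the page exactly):

```python
# Hedged sketch: writing a Pfeiffer-style adapter config to pfeiffer.json.
# Field names/defaults assumed from the PfeifferConfig docs, not copied verbatim.
import json

pfeiffer = {
    "mh_adapter": False,       # no adapter after the multi-head attention block
    "output_adapter": True,    # adapter after the feed-forward block
    "reduction_factor": 16,    # bottleneck down-projection factor
    "non_linearity": "relu",   # activation inside the bottleneck
}

with open("pfeiffer.json", "w") as f:
    json.dump(pfeiffer, f, indent=2)
```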
And my bash script now looks like this:

Running the above also yields the following error:
TypeError: transformers.adapters.configuration.AdapterConfigBase.load() argument after ** must be a mapping, not str
, as shown here:
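For comparison, here is a hedged sketch of loading the same configuration directly in Python, assuming the adapter-transformers 3.x API (AutoAdapterModel, AdapterConfig.load); roberta-base is just an example model name:

```python
# Hedged sanity check, assuming the adapter-transformers 3.x Python API.
from transformers.adapters import AdapterConfig, AutoAdapterModel

config = AdapterConfig.load("pfeiffer+inv")               # a registered config name string
model = AutoAdapterModel.from_pretrained("roberta-base")  # example model only
model.add_adapter("squad", config=config)                 # attach a fresh adapter
model.train_adapter("squad")                              # freeze the base model, train only the adapter
```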
Expected behavior
I expected the model to run normally in adapter mode, without the need for full fine-tuning. That is, I expected the following results at the end of a successful run of the script:
Complete Output
Thanks so much in advance!