Using distributed or parallel set-up in script?: Yes
Information
Model I am using (Bert, XLNet ...): EncoderDecoderModel
Language I am using the model on (English, Chinese ...): English
Adapter setup I am using (if any): AdapterConfig
The problem arises when using:
the official example scripts: (give details below)
my own modified scripts: (give details below)
The task I am working on is:
an official GLUE/SQUaD task: (give the name)
my own task or dataset: (give details below)
To reproduce
from transformers import EncoderDecoderModel, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained("bert-base-uncased", "bert-base-uncased")
model.add_adapter("pfeiffer")
model.set_active_adapters("pfeiffer")
text = "This is a test sentence."
inputs = tokenizer(text, return_tensors="pt")
model.generate(inputs.input_ids, bos_token_id=tokenizer.bos_token_id)
Error message:
Traceback (most recent call last):
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3553, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-16-61f95235ade8>", line 1, in <module>
model.generate(inputs.input_ids, bos_token_id=tokenizer.bos_token_id)
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/transformers/generation_utils.py", line 1343, in generate
inputs_tensor, model_kwargs, model_input_name
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/transformers/generation_utils.py", line 585, in _prepare_encoder_decoder_kwargs_for_generation
with ForwardContext(self, **encoder_kwargs):
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/transformers/adapters/context.py", line 86, in __init__
model.forward_context(self, *args, **kwargs)
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/transformers/adapters/model_mixin.py", line 794, in forward_context
context.prefix_states = self.base_model.prefix_tuning(*args, **kwargs)
File "/nfsshare/home/xiaozeguan/anaconda3/envs/tnmt/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1270, in __getattr__
type(self).__name__, name))
AttributeError: 'EncoderDecoderModel' object has no attribute 'prefix_tuning'
Hey!
I am not sure if this addresses the underlying problem, but something like model.prefix_tuning = copy.copy(model.decoder.bert.prefix_tuning) seems to make the code work.
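The one-liner above can be sketched as a small helper. This is a hypothetical workaround, not part of adapter-transformers: the helper name (patch_prefix_tuning) is made up, and the model.decoder.bert.prefix_tuning path assumes the adapter-transformers 3.1.0 layout for an EncoderDecoderModel built from two BERT checkpoints. A SimpleNamespace stands in for the real model here so the sketch runs without downloading checkpoints.

```python
import copy
from types import SimpleNamespace

def patch_prefix_tuning(model):
    # Hypothetical helper: alias the decoder's prefix_tuning pool onto the
    # top-level EncoderDecoderModel so that forward_context() can resolve
    # base_model.prefix_tuning during generate(). Untested beyond the note
    # above; the decoder.bert attribute path is an assumption.
    if not hasattr(model, "prefix_tuning"):
        model.prefix_tuning = copy.copy(model.decoder.bert.prefix_tuning)
    return model

# Stand-in object mirroring the attribute layout that triggers the error:
# decoder.bert.prefix_tuning exists, but the top-level model has no
# prefix_tuning attribute of its own.
dummy = SimpleNamespace(
    decoder=SimpleNamespace(bert=SimpleNamespace(prefix_tuning={"pool": None}))
)
patch_prefix_tuning(dummy)
```

Note this only papers over the missing attribute; if you actually train a prefix-tuning adapter (rather than the pfeiffer bottleneck config used in the repro), you would want to check that the encoder and decoder share the state you expect.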
Environment info
adapter-transformers version: 3.1.0