T5ForConditionalGeneration: After calling adapters.init() the data_collator input misses attention_mask #737
Labels: bug (Something isn't working)
Environment info
adapters version: 1.0

Information
Model I am using (Bert, XLNet ...): T5ForConditionalGeneration
Language I am using the model on (English, Chinese ...): any
Adapter setup I am using (if any): The bug is not dependent on the adapter setup.
Expected behavior
The input that the data collator gets should be the same, independent of whether I use T5ForConditionalGeneration, T5ForConditionalGeneration with adapters.init(model), or AutoAdapterModel with a sequence-to-sequence head.

However, for T5ForConditionalGeneration this isn't true: when using the default HF T5ForConditionalGeneration or the AutoAdapterModel with a seq2seq head, the input for the data collator is dict_keys(['input_ids', 'attention_mask']). However, when using T5ForConditionalGeneration with adapters.init(model), the data collator receives only dict_keys(['input_ids']), i.e. the "attention_mask" is missing!

I tested it for other models & tasks: both of those models don't show this bug. Output of the script below:
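For reference, a minimal sketch of the three setups being compared (t5-small and the head name are only placeholders, not from the original report):

```python
import adapters
from adapters import AutoAdapterModel
from transformers import T5ForConditionalGeneration

# 1) plain Hugging Face model -> collator receives input_ids and attention_mask
model_hf = T5ForConditionalGeneration.from_pretrained("t5-small")

# 2) same model after adapters.init() -> collator receives only input_ids (the bug)
model_init = T5ForConditionalGeneration.from_pretrained("t5-small")
adapters.init(model_init)

# 3) AutoAdapterModel with a seq2seq head -> collator receives input_ids and attention_mask
model_adapter = AutoAdapterModel.from_pretrained("t5-small")
model_adapter.add_seq2seq_lm_head("seq2seq_head")
```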
To reproduce
The output pasted below is coming from this script:
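The original script and its pasted output are not included here. As a rough, hypothetical sketch of a comparable reproduction (checkpoint, toy dataset, and the logging collator wrapper are my own placeholders, not the original script):

```python
import adapters
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
)

tokenizer = AutoTokenizer.from_pretrained("t5-small")

# toy seq2seq dataset, just enough to run one training step
raw = Dataset.from_dict({
    "source": ["translate English to German: Hello", "translate English to German: Thanks"],
    "target": ["Hallo", "Danke"],
})

def preprocess(example):
    model_inputs = tokenizer(example["source"], truncation=True)
    model_inputs["labels"] = tokenizer(example["target"], truncation=True)["input_ids"]
    return model_inputs

dataset = raw.map(preprocess, remove_columns=raw.column_names)

# setup under test: swap in any of the three variants from the sketch above
model = T5ForConditionalGeneration.from_pretrained("t5-small")
adapters.init(model)  # comment out this line to compare against the plain HF model

base_collator = DataCollatorForSeq2Seq(tokenizer, model=model)

def logging_collator(features):
    # show which keys survive the Trainer's column filtering before padding
    print("data_collator received:", list(features[0].keys()))
    return base_collator(features)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="tmp_out",
        max_steps=1,
        per_device_train_batch_size=2,
        report_to=[],
    ),
    train_dataset=dataset,
    data_collator=logging_collator,
)
trainer.train()
```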