I am extending the embeddings of the decoder of an EncoderDecoderModel. When I save the model, the saved config does not reflect the new embedding size. However, the same resize works fine for non-EncoderDecoderModel models.
To reproduce
In [1]: import transformers as t

In [2]: model = t.EncoderDecoderModel.from_encoder_decoder_pretrained('bert-base-uncased', 'bert-base-uncased')

In [3]: model.decoder.bert.embeddings.word_embeddings
Out[3]: Embedding(30522, 768, padding_idx=0)

In [4]: model.decoder.resize_token_embeddings(30522 + 100)
Out[4]: Embedding(30622, 768)

In [5]: model.save_pretrained('test-bert')
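To make the mismatch concrete, here is a small check I would expect to expose it (a sketch assuming the bug as described; for an EncoderDecoderModel, the decoder config is serialized under the "decoder" key of config.json):

import json

# Inspect the vocab size that was actually written to disk.
with open('test-bert/config.json') as f:
    cfg = json.load(f)

# If the bug is as described, this still prints 30522 rather than 30622.
print(cfg['decoder']['vocab_size'])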
Expected behavior
The updated embedding size (30622) should be saved in the decoder section of config.json.
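As a stopgap, manually syncing the top-level config before saving seems to work. This is a minimal sketch, assuming EncoderDecoderConfig keeps its own copy of the decoder config (so that model.config.decoder is not updated by the resize above):

import transformers as t

model = t.EncoderDecoderModel.from_encoder_decoder_pretrained(
    'bert-base-uncased', 'bert-base-uncased'
)
model.decoder.resize_token_embeddings(30522 + 100)

# Assumption: the resize only updates model.decoder.config, so copy the
# new vocab size into the composite config by hand before saving.
model.config.decoder.vocab_size = model.decoder.config.vocab_size

model.save_pretrained('test-bert')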
Environment info

transformers version: 4.5.0

Who can help
@patrickvonplaten, @patil-suraj