ValueError: [E203] If the transformer embedding layer is not updated during training, make sure to include it in 'annotating components' #13332
-
Hi,
I'm getting this error:

`ValueError: [E203] If the transformer embedding layer is not updated during training, make sure to include it in 'annotating components'`

This is my output:
-
Hi, thanks for the report! As I can't run your code, could you try something to confirm my suspicion? I think the "annotating_components" feature is not actually compatible yet with the experimental rehearsal functionality.
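For reference, a minimal sketch of how `annotating_components` is normally declared in a spaCy training config when the transformer is frozen; the pipeline names here are placeholders, not from the original report:

```ini
[nlp]
pipeline = ["transformer","ner"]

[training]
# A frozen component is not updated during training...
frozen_components = ["transformer"]
# ...so it must also run as an annotating component, setting its
# predictions on the docs so downstream listeners can use them.
annotating_components = ["transformer"]
```

This is the setup the E203 error message is pointing at; per the reply above, it just doesn't yet play well with the experimental rehearsal code path.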
In cases where the old data is unavailable but the model's original performance on it was quite good (say >90% F-score), you could consider creating silver annotations by running your model over texts resembling the old data, and then mixing that silver data in with the annotations on your new data. In general, I would advise training from scratch on the combined old+new data instead of using the rehearsal functionality (which is indeed experimental and, as we found out, doesn't actually work with annotating components yet :/)
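If you go the silver-annotation route, a minimal sketch of what that could look like; the model path and texts below are placeholders for illustration, not from the original thread:

```python
import spacy
from spacy.tokens import DocBin

# Placeholder: the model that still performs well on the old data
nlp = spacy.load("path/to/your_trained_model")

# Placeholder: raw texts that resemble the old training data
old_like_texts = [
    "First text resembling the old data.",
    "Second text resembling the old data.",
]

# Run the model over the texts to produce silver annotations
# and pack the annotated docs into a DocBin
doc_bin = DocBin()
for doc in nlp.pipe(old_like_texts):
    doc_bin.add(doc)

# Serialize to a .spacy corpus file; mix this in with your gold
# annotations on the new data and train from scratch on both
doc_bin.to_disk("silver_old_data.spacy")
```

It's worth spot-checking a sample of the silver docs before training, since any systematic errors the model makes will be baked into the mixed corpus.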