initializing TFBertForTokenClassification: ['dropout_37'] #3

Open
gholinavaz opened this issue Jul 19, 2023 · 0 comments

Comments

@gholinavaz
When running, the note below is displayed. Is there anything that needs to be done about it?
Some layers from the model checkpoint at HooshvareLab/bert-base-parsbert-ner-uncased were not used when initializing TFBertForTokenClassification: ['dropout_37']

  • This IS expected if you are initializing TFBertForTokenClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
  • This IS NOT expected if you are initializing TFBertForTokenClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
All the layers of TFBertForTokenClassification were initialized from the model checkpoint at HooshvareLab/bert-base-parsbert-ner-uncased.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFBertForTokenClassification for predictions without further training.
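
For context, here is a minimal sketch of the kind of load that typically produces this note (an assumption, since the exact code is not shown in the issue; it uses the standard transformers `from_pretrained` API). The skipped layer is a dropout layer, which has no trainable weights, so the note can usually be ignored.

```python
# Minimal sketch, assuming the warning comes from a standard
# from_pretrained call like the one below (the issue does not
# include the actual code that was run).
from transformers import AutoTokenizer, TFBertForTokenClassification

model_name = "HooshvareLab/bert-base-parsbert-ner-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Loading the checkpoint into TFBertForTokenClassification emits the
# "Some layers ... were not used" note for 'dropout_37'. Dropout layers
# carry no weights, so skipping one does not change the model's outputs.
model = TFBertForTokenClassification.from_pretrained(model_name)
```

Since the log also says that all layers of TFBertForTokenClassification were initialized from the checkpoint, the model should be usable for NER predictions as-is, without further training.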