I get the following error when using DistilBERT to fine-tune on a text classification task:
tDBERT, learnerDBERT = ktrainModel("distilbert-base-uncased", max_words, class_names,
                                   x_train_uncleaned_raw, y_train_uncleaned_raw - 1,
                                   x_test_uncleaned_raw, y_test_uncleaned_raw - 1,
                                   batchSizeTransformer)
Some weights of the model checkpoint at distilbert-base-uncased were not used when initializing TFDistilBertForSequenceClassification: ['vocab_projector', 'vocab_layer_norm', 'activation_13', 'vocab_transform']
- This IS expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPretraining model).
- This IS NOT expected if you are initializing TFDistilBertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of TFDistilBertForSequenceClassification were not initialized from the model checkpoint at distilbert-base-uncased and are newly initialized: ['classifier', 'pre_classifier', 'dropout_19']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
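For context, my `ktrainModel` helper is just a thin wrapper around the standard ktrain Transformer workflow; a rough sketch is below (variable names are the same placeholders as in the call above, not ktrain defaults). The warning above is printed when the classifier is built from the pretrained checkpoint.

import ktrain
from ktrain import text

MODEL_NAME = "distilbert-base-uncased"

# Build the preprocessor and preprocess the raw train/test texts.
t = text.Transformer(MODEL_NAME, maxlen=max_words, class_names=class_names)
trn = t.preprocess_train(x_train_uncleaned_raw, y_train_uncleaned_raw - 1)
val = t.preprocess_test(x_test_uncleaned_raw, y_test_uncleaned_raw - 1)

# get_classifier() loads the checkpoint into TFDistilBertForSequenceClassification,
# which is where the "Some weights ... were not used" warning appears.
model = t.get_classifier()
learner = ktrain.get_learner(model, train_data=trn, val_data=val,
                             batch_size=batchSizeTransformer)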
At least for my task, performance doesn't seem to be affected, but I was wondering whether this warning is problematic in any way.
This is a weird but harmless warning, introduced (I believe) in transformers==3.1.0, that is confusing many others as well. I have suppressed this warning in v0.21.0 of ktrain.
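If you are on an older ktrain release and want to silence it yourself, one option (a minimal sketch, using the transformers logging utilities rather than anything ktrain-specific, and assuming a transformers version that ships them) is to lower the library's verbosity before building the model:

# Hide transformers warnings such as "Some weights ... were not used" /
# "... newly initialized", which are expected when a fresh classification
# head is added on top of a pretrained checkpoint.
from transformers import logging as hf_logging

hf_logging.set_verbosity_error()  # only errors are shown; warnings/info are suppressed

The warning itself just means the masked-LM head of distilbert-base-uncased was discarded and a new classification head ('classifier', 'pre_classifier') was randomly initialized, which is exactly what fine-tuning for sequence classification requires.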