I've stumbled upon this issue that might get rid of that weird `Some weights of the model checkpoint at bert-base-uncased were not used when initializing Bert...` warning that you've probably noticed every time you run any notebook :)

The thing is, the solution proposed there, simply raising the log verbosity threshold with `transformers.logging.set_verbosity_error()`, doesn't necessarily align with the guidelines from our supervisor about the amount of logs. Would be nice to hear your take on whether it's worth pursuing: it's just one line (two, if you count the import), but it also puts a blanket limit on our logs :/
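For reference, a possibly less blunt alternative (a sketch, not the fix proposed in the linked issue): since transformers uses Python's standard `logging` module, we could raise the level only for the logger that emits the checkpoint warning (`transformers.modeling_utils` is my assumption about its name), instead of silencing the whole library. The demo below uses only stdlib logging so it runs without transformers installed:

```python
import logging

# Hypothetical targeted mute (an assumption, not the linked issue's fix):
# raise the level only for the module believed to emit the
# "Some weights ... were not used" warning, not for all of transformers.
logging.getLogger("transformers.modeling_utils").setLevel(logging.ERROR)

# Stdlib-only demo that the mute is targeted:
records = []

class Capture(logging.Handler):
    def emit(self, record):
        records.append(record.getMessage())

root = logging.getLogger()
root.addHandler(Capture())
root.setLevel(logging.WARNING)

# Mimics the checkpoint warning: filtered out by the ERROR level set above.
logging.getLogger("transformers.modeling_utils").warning("Some weights were not used")
# Warnings from everything else still get through.
logging.getLogger("our_pipeline").warning("still visible")

print(records)  # → ['still visible']
```

That way our own pipeline logs stay untouched, which might sit better with the supervisor's guidelines, but it does depend on the logger name staying stable across transformers versions.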