PreTrainedTokenizerFast._batch_encode_plus() got an unexpected keyword argument 'split_special_tokens' #30685
Labels: Core: Tokenization
System Info
Transformers version: 4.38.1
Platform: Ubuntu
Python version: 3.10.13
Who can help?
@ArthurZucker @younesbelkada
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
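The original snippet is not preserved here; a minimal sketch that reproduces the error on transformers 4.38.1 (the checkpoint name is an assumption, any fast tokenizer behaves the same way):

```python
# Minimal reproduction sketch for transformers 4.38.1.
# The checkpoint name is an assumption; any fast tokenizer triggers the error.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # fast by default

# The slow (Python) tokenizers accept this keyword, but on 4.38.1 the fast
# path forwards it to _batch_encode_plus(), whose signature does not, raising:
#   TypeError: PreTrainedTokenizerFast._batch_encode_plus() got an
#   unexpected keyword argument 'split_special_tokens'
tokenizer(["[CLS] hello world"], split_special_tokens=True)
```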
Expected behavior
The call should succeed and return the encoded inputs, with special-token strings split and tokenized as ordinary text, rather than raising a TypeError.
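For reference, the slow tokenizer honors the keyword, which also serves as a workaround until the fast tokenizer supports it (a sketch under the same checkpoint assumption):

```python
# Sketch of the expected behavior, using the slow tokenizer as a reference;
# the checkpoint name is an assumption.
from transformers import AutoTokenizer

slow = AutoTokenizer.from_pretrained("bert-base-uncased", use_fast=False)

# With split_special_tokens=True, a special-token string such as "[CLS]" is
# broken into ordinary word pieces instead of being kept as one special token.
print(slow.tokenize("[CLS] hello world", split_special_tokens=True))
print(slow.tokenize("[CLS] hello world"))  # default: "[CLS]" stays whole
```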