This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Update docs
dirkgr committed Apr 13, 2021

Verified: this commit was signed with the committer's verified signature (Hameer Abbasi).
1 parent 454ba3b commit 9555b2c
Showing 1 changed file with 0 additions and 1 deletion.
@@ -40,7 +40,6 @@ class PretrainedTransformerTokenizer(Tokenizer):
         to their model.
     max_length : `int`, optional (default=`None`)
         If set to a number, will limit the total sequence returned so that it has a maximum length.
-        If there are overflowing tokens, those will be added to the returned dictionary
     stride : `int`, optional (default=`0`)
         If set to a number along with max_length, the overflowing tokens returned will contain some tokens
         from the main sequence returned. The value of this argument defines the number of additional tokens.
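The interplay of `max_length` and `stride` described in the docstring can be sketched in standalone form. This is a hypothetical illustration of the windowing behavior, not AllenNLP's actual implementation (the real tokenizer delegates the work to the underlying Hugging Face tokenizer); the helper name `window_tokens` is invented for this example.

```python
def window_tokens(tokens, max_length, stride=0):
    """Split `tokens` into windows of at most `max_length` tokens.

    When the sequence overflows, each additional window repeats the last
    `stride` tokens of the previous window, mirroring the documented
    behavior of the `stride` parameter.
    """
    if stride >= max_length:
        raise ValueError("stride must be smaller than max_length")
    if len(tokens) <= max_length:
        return [tokens]
    windows = []
    start = 0
    while start < len(tokens):
        windows.append(tokens[start:start + max_length])
        if start + max_length >= len(tokens):
            break
        # Advance so the next window overlaps the previous by `stride` tokens.
        start += max_length - stride
    return windows

# With max_length=4 and stride=1, each overflow window shares one token
# with the window before it.
print(window_tokens(list(range(10)), max_length=4, stride=1))
# → [[0, 1, 2, 3], [3, 4, 5, 6], [6, 7, 8, 9]]
```

With `stride=0` the windows are disjoint; a positive `stride` gives overlapping context at window boundaries, which is useful for tasks like span extraction where an answer might straddle a window edge.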
