If I want to use the `SequenceClassifier` pipeline for something like reranking, I am (sort of) able to do so using the exposed `forward_t` method. The problem is that I need to first encode the inputs using the model's tokenizer. I can get a reference to the tokenizer using `get_tokenizer`, but if I want to pass tokenizer params (i.e. `max_len` and `device`) to `tokenizer.tokenize`, I cannot get them from the `SequenceClassificationModel`, because they are private fields and there are no getter methods for them like there is for the tokenizer itself.
Alternatively, you could add a method that wraps calls to `SequenceClassificationModel.tokenizer.tokenize` and passes these parameters in from the model instance.
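For context, here is a minimal sketch of the tokenization step this is about, written as a standalone helper that takes `max_len` and `device` explicitly because the model exposes no getters for them today. The helper name, the use of `BertTokenizer`, the `encode_list` signature, and padding with 0 are assumptions for a recent `rust_tokenizers`/`tch` API, not rust-bert's internals:

```rust
// Hedged sketch, not rust-bert's actual API: tokenize inputs for
// SequenceClassificationModel::forward_t, with `max_len` and `device`
// passed in explicitly since the model's fields are private.
use rust_tokenizers::tokenizer::{BertTokenizer, Tokenizer, TruncationStrategy};
use tch::{Device, Tensor};

fn encode_for_classification(
    tokenizer: &BertTokenizer,
    inputs: &[&str],
    max_len: usize, // duplicated from the model's config, since the field is private
    device: Device, // likewise
) -> Tensor {
    let tokenized =
        tokenizer.encode_list(inputs, max_len, &TruncationStrategy::LongestFirst, 0);
    // Pad all sequences to the longest one in the batch so they can be stacked.
    let longest = tokenized.iter().map(|t| t.token_ids.len()).max().unwrap_or(0);
    let ids: Vec<Tensor> = tokenized
        .into_iter()
        .map(|mut t| {
            // 0 as pad id is an assumption; use the vocab's pad token in practice.
            t.token_ids.resize(longest, 0);
            Tensor::of_slice(&t.token_ids)
        })
        .collect();
    // The resulting batch of input ids can then be fed to forward_t.
    Tensor::stack(&ids, 0).to(device)
}
```

A wrapper method along these lines on `SequenceClassificationModel` (or simple getters for `max_len` and `device`) would do the same thing using the private fields directly and remove the need to duplicate those values at the call site.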