Added tokenizer kwargs for fill mask pipeline #22995

Closed · wants to merge 1 commit

Conversation

sajeedmehrab commented Apr 25, 2023

Added tokenizer kwargs for the fill-mask pipeline, which makes it possible to truncate, pad, or set a max length for the tokenizer. After this edit, the pipeline can be used as follows:

from transformers import pipeline

fill_mask_pipeline = pipeline(
    'fill-mask',
    model=model,          # an already-loaded masked-LM model
    tokenizer=tokenizer,  # its corresponding tokenizer
    device=0
)

tokenizer_kwargs = {'truncation': True, 'max_length': 2048}
output = fill_mask_pipeline("Text to predict <mask>", **tokenizer_kwargs)

sgugger (Collaborator) commented Apr 25, 2023

cc @Narsil

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

Narsil (Contributor) commented Apr 25, 2023

Can we refactor in order to do:

 output = fill_mask_pipeline("Text to predict <mask>", tokenizer_kwargs=tokenizer_kwargs)

instead? Accepting kwargs directly is very hard to maintain down the line because of clashing arguments (for instance, max_length is one that pops up often enough).

We can also whitelist some parameters like truncation or padding to make them more convenient, but enabling all the kwargs directly is really not something we want, I think.

Thanks for the contribution though, it's a step in the right direction!
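For reference, here is a minimal sketch of what that plumbing could look like, written as a hypothetical subclass (FillMaskPipelineWithTokenizerKwargs is an illustrative name, not part of this PR); it follows the _sanitize_parameters / preprocess split of the transformers Pipeline API:

from transformers import FillMaskPipeline

class FillMaskPipelineWithTokenizerKwargs(FillMaskPipeline):
    # Illustrative sketch only: accept a single tokenizer_kwargs dict and
    # route it to preprocess(), so tokenizer options cannot clash with
    # pipeline-level arguments such as max_length.

    def _sanitize_parameters(self, tokenizer_kwargs=None, **kwargs):
        preprocess_params, forward_params, postprocess_params = super()._sanitize_parameters(**kwargs)
        if tokenizer_kwargs is not None:
            preprocess_params["tokenizer_kwargs"] = tokenizer_kwargs
        return preprocess_params, forward_params, postprocess_params

    def preprocess(self, inputs, return_tensors=None, tokenizer_kwargs=None, **kwargs):
        if return_tensors is None:
            return_tensors = self.framework
        if tokenizer_kwargs is None:
            tokenizer_kwargs = {}
        # Only the explicitly passed dict reaches the tokenizer call.
        model_inputs = self.tokenizer(inputs, return_tensors=return_tensors, **tokenizer_kwargs)
        self.ensure_exactly_one_mask_token(model_inputs)
        return model_inputs

With that in place, the call shape suggested above works unchanged: output = fill_mask_pipeline("Text to predict <mask>", tokenizer_kwargs={'truncation': True, 'max_length': 2048}).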

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.
