
Pipeline: no side-effects on model.config and model.generation_config 🔫 #33480

Merged — 7 commits into huggingface:main, Sep 18, 2024

Conversation

@gante (Member) commented Sep 13, 2024

What does this PR do?

Fixes #33398 with two changes (and corresponding regression-preventing tests):

  • [major change] Pipelines now hold their own generation_config. Modifications that were previously applied to model.config or model.generation_config are now redirected to the pipeline's generation_config, thus avoiding side-effects on the model.
```python
from transformers import pipeline, T5ForConditionalGeneration, AutoTokenizer
import tempfile

model = T5ForConditionalGeneration.from_pretrained("t5-small")
pipe = pipeline(
    model=model,
    tokenizer=AutoTokenizer.from_pretrained("t5-small", model_max_length=100),
    task="translation_en_to_de"
)

with tempfile.TemporaryDirectory() as tmpdirname:
    model.save_pretrained(tmpdirname)
    model_2 = T5ForConditionalGeneration.from_pretrained(tmpdirname)

# Prior to this PR, `model_2` != `model` because the pipeline initialization changed `model.config`!
```
  • [minor change] In Forbid PretrainedConfig from saving generate parameters; Update deprecations in generate-related code 🧹 #32659, we started moving custom generate parameterization from model.config to model.generation_config when model.save_pretrained() is called. When a parameter is moved, model.config sets that parameter to None. This None was not being properly ignored in the legacy code path (parameterization in model.config takes precedence when the user never explicitly defined a GenerationConfig). After this PR, None fields are ignored when building the generation config.
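A minimal sketch of the intended merge behavior (illustrative only, not the actual transformers implementation — the function name and dict-based configs are stand-ins): legacy values coming from model.config override generation defaults, but fields that were reset to None during the migration are skipped instead of clobbering the defaults.

```python
def merge_legacy_config(generation_defaults, legacy_params):
    """Merge legacy model.config values over generation defaults, skipping None."""
    merged = dict(generation_defaults)
    for key, value in legacy_params.items():
        if value is not None:  # None marks a parameter already moved to generation_config
            merged[key] = value
    return merged

defaults = {"max_length": 20, "num_beams": 1}
legacy = {"max_length": None, "num_beams": 4}  # max_length was moved out and nulled
print(merge_legacy_config(defaults, legacy))  # {'max_length': 20, 'num_beams': 4}
```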

@gante gante marked this pull request as ready for review September 13, 2024 17:36
@gante gante requested a review from LysandreJik September 13, 2024 17:36
@LysandreJik (Member) commented:
Thanks! The changes make sense to me; I'd like to have @Rocketknight1's opinion as well

@Rocketknight1 (Member) left a comment:

This approach makes sense to me! I don't think breaking that dark pattern is too much of a problem.

The only question, though, is whether it would be better to just get rid of those side effects in the pipelines, rather than doing this copy to patch over the problem. Do you know how many pipelines do this?

@gante (Member, Author) commented Sep 17, 2024

> The only question, though, is whether it would be better to just get rid of those side effects in the pipelines, rather than doing this copy to patch over the problem. Do you know how many pipelines do this?

@Rocketknight1 agreed 🙌

You raised a good question -- in the rush to get the problem solved, I couldn't see the forest for the trees. From a quick check, only pipelines that call generate actually have this pattern, where config is modified to control the generate call. Good thing we created a tool precisely to avoid this pattern: the generation_config! :D

As such, instead of doing this copy to patch over the problem, I think we can instead execute the following plan:

  1. Text generation pipelines hold their own generation_config.
  2. Task-specific parameters (sometimes present in model.config, e.g. here) are applied exclusively to the pipeline's generation_config; the model's config and generation_config stay untouched.
  3. prefix (an item that may be present in the task-specific parameters but has no field in GenerationConfig) is stored separately as a pipeline attribute.
  4. All references to generation parameters are redirected to the pipeline's generation_config.
  5. The pipeline's generation_config is passed to generate in all pipelines.
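The plan above could be sketched roughly as follows (hypothetical stand-in classes, not the actual transformers pipeline code — GenConfig, TinyModel, and TinyPipeline are illustrative):

```python
import copy
from dataclasses import dataclass, field

@dataclass
class GenConfig:
    max_length: int = 20
    num_beams: int = 1

    def update(self, params):
        for key, value in params.items():
            setattr(self, key, value)

@dataclass
class TinyModel:
    generation_config: GenConfig = field(default_factory=GenConfig)

class TinyPipeline:
    def __init__(self, model, task_params=None):
        self.model = model
        # step 1: hold an independent copy -- the model's config is never mutated
        self.generation_config = copy.deepcopy(model.generation_config)
        task_params = dict(task_params or {})
        # step 3: `prefix` has no GenerationConfig field, so store it on the pipeline
        self.prefix = task_params.pop("prefix", None)
        # step 2: task-specific parameters only touch the pipeline's copy
        self.generation_config.update(task_params)

model = TinyModel()
pipe = TinyPipeline(model, {"max_length": 100, "prefix": "translate English to German: "})
print(model.generation_config.max_length)  # 20 -- model untouched
print(pipe.generation_config.max_length)   # 100
```

Steps 4-5 would then consist of the pipeline reading from, and passing, `self.generation_config` wherever generate is called.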

WDYT?

@Rocketknight1 (Member) commented:
Yes, that sounds good, and a lot cleaner!

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@gante gante changed the title Pipeline: hold copy of configs Pipeline: no side-effects on model.config and model.generation_config 🔫 Sep 17, 2024
@gante gante requested a review from Rocketknight1 September 17, 2024 18:13
@gante (Member, Author) commented Sep 17, 2024

@Rocketknight1 v2 is ready for a review 🤗

@Rocketknight1 (Member) left a comment:

Yes, LGTM now! I think this approach is way cleaner.

@gante gante merged commit 7542fac into huggingface:main Sep 18, 2024
23 checks passed
@gante gante deleted the fix_33398 branch September 18, 2024 14:43
itazap pushed a commit to NielsRogge/transformers that referenced this pull request Sep 20, 2024
amyeroberts pushed a commit to amyeroberts/transformers that referenced this pull request Oct 2, 2024
BernardZach pushed a commit to BernardZach/transformers that referenced this pull request Dec 5, 2024
BernardZach pushed a commit to innovationcore/transformers that referenced this pull request Dec 6, 2024
Successfully merging this pull request may close these issues.

GenerationConfig is not handled correctly when saving multi-task models
4 participants