
Conversation

@kaixuanliu (Contributor) commented Oct 13, 2025

In the full pipeline, cut_context is set to True (L511), while for pipe_without_text_encoders this parameter defaults to False (L116), so there is a small difference between their outputs (in the UNet at L492 the mean value with and without cut_context differs). Platforms such as Intel XPU use their own separate generator (L145), so the default tolerance of 1e-4 is not enough there. Here we explicitly set encode_prompt_inputs["_cut_context"] = True to align with the behavior of the full pipeline. @DN6, please help review, thanks!
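For illustration, a sketch of what this test-level override could look like in tests/pipelines/kandinsky3/test_kandinsky3_img2img.py; the extra_required_param_value_dict hook name and the mixin import path are assumptions about the shared pipeline tester, not copied from the repository:

```python
# Sketch only: force the _cut_context value that the full pipeline hardcodes,
# so the isolated encode_prompt output matches the full pipeline run.
import unittest

# Assumed import path for the shared pipeline test mixin.
from ..test_pipelines_common import PipelineTesterMixin


class Kandinsky3Img2ImgPipelineFastTests(PipelineTesterMixin, unittest.TestCase):
    # ... pipeline components and test setup omitted ...

    def test_encode_prompt_works_in_isolation(self):
        # Hypothetical hook: pass extra required kwargs through to encode_prompt.
        extra_required_param_value_dict = {"_cut_context": True}
        return super().test_encode_prompt_works_in_isolation(extra_required_param_value_dict)
```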

Signed-off-by: Liu, Kaixuan <kaixuan.liu@intel.com>
@kaixuanliu (Contributor, Author) commented:

Hi @a-r-r-o-w @DN6, can you help review? Thanks!

@regisss (Contributor) commented Oct 22, 2025

@kaixuanliu Does this only affect XPU? Or GPU too?
Can you also provide the test command to reproduce it, please?

@kaixuanliu (Contributor, Author) commented Oct 22, 2025

@regisss The difference exists on GPU as well, although there the default tolerance is enough to pass. Steps to reproduce:
pytest -rA tests/pipelines/kandinsky3/test_kandinsky3_img2img.py::Kandinsky3Img2ImgPipelineFastTests::test_encode_prompt_works_in_isolation

@DN6 (Collaborator) commented Oct 23, 2025

@kaixuanliu A better solution here is to change the default value to True in the encode_prompt signature, since it's hardcoded to True in the pipeline anyway.
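For reference, a minimal sketch of that signature change (the surrounding parameter list below is approximated, not copied from the file; only the _cut_context default is the point):

```python
# Sketch: Kandinsky3 img2img encode_prompt with _cut_context defaulting to True,
# matching what the pipeline's __call__ already passes. Parameter list abbreviated.
def encode_prompt(
    self,
    prompt,
    do_classifier_free_guidance=True,
    num_images_per_prompt=1,
    device=None,
    negative_prompt=None,
    prompt_embeds=None,
    negative_prompt_embeds=None,
    _cut_context=True,  # previously defaulted to False
    attention_mask=None,
    negative_attention_mask=None,
):
    ...
```

With this default, the isolated encode_prompt call in the test and the full pipeline should behave the same, so no per-test override is needed.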

Signed-off-by: Liu, Kaixuan <kaixuan.liu@intel.com>
@kaixuanliu (Contributor, Author) commented:

@DN6 I agree. Have updated the code.

@HuggingFaceDocBuilderDev commented:

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@DN6 (Collaborator) left a comment

Thanks 👍🏽

DN6 merged commit 85eb505 into huggingface:main on Oct 23, 2025
9 of 11 checks passed