Conversation

@Cyrilvallez Cyrilvallez (Member) commented Sep 9, 2025

What does this PR do?

Apart from the obvious removal of tf/jax support, I believe the following should be the only potential breaking changes for torch-only code:

  • pipelines no longer take a framework argument
  • onnx config methods no longer take a framework argument

This may break existing torch code that passes framework="pt" explicitly, but it's a necessary change. It makes no sense to keep those arguments when torch is the only framework these objects support, and it would be odd to keep them only for backward compatibility when we are breaking the support anyway (a short sketch follows below).
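
For illustration, a minimal sketch of what this means for pipeline callers; the model name, prompt, and generation kwargs are placeholders, not taken from this PR:

```python
from transformers import pipeline

# Before this PR, callers could pin the backend explicitly:
#     generator = pipeline("text-generation", model="gpt2", framework="pt")
# After this PR the framework argument is gone, since torch is the only backend:
generator = pipeline("text-generation", model="gpt2")
print(generator("Hello there", max_new_tokens=5)[0]["generated_text"])
```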

Note: I did not remove traces of tensorflow/jax from the docs' .md (markdown) files for now, as this PR is already enormous. It is a very tedious task, and a lot of the documentation is written in scripts I cannot read at all. It will be done in a subsequent PR, hopefully with the help of AI (which should be a perfect fit for that).

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

This was referenced Sep 11, 2025
@Cyrilvallez Cyrilvallez changed the title Fully remove Tensorflow and Jax support library-wide 🚨🚨🚨 Fully remove Tensorflow and Jax support library-wide Sep 17, 2025
@ArthurZucker ArthurZucker (Collaborator) left a comment

What a cleanup!
Be careful about the conversion scripts: we keep the ones that go from original -> torch.

@@ -181,7 +174,7 @@ def _sanitize_parameters(
             preprocess_params["prefix"] = prefix
         if prefix:
             prefix_inputs = self.tokenizer(
-                prefix, padding=False, add_special_tokens=add_special_tokens, return_tensors=self.framework
+                prefix, padding=False, add_special_tokens=add_special_tokens, return_tensors="pt"
Collaborator:

I understand why you wanted it to return pt by default. We could also have a bool, return_tensors=True, to return pt tensors.

Member Author:

In all the pipelines, self.framework was always set to "pt" for torch models during init. So I simply removed self.framework and made the "pt" explicit!
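
As a rough sketch of what the diff above boils down to (the tokenizer name and input string are placeholders, not taken from the PR), hard-coding the tensor type looks like this:

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Previously the pipeline passed return_tensors=self.framework, which resolved to "pt"
# for torch models anyway; now the value is simply written out explicitly.
prefix_inputs = tokenizer(
    "Once upon a time", padding=False, add_special_tokens=False, return_tensors="pt"
)
print(isinstance(prefix_inputs["input_ids"], torch.Tensor))  # True
```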

Collaborator:

No no, I mean that because you now have to manually say "pt", I understand why you told me you want to default to returning tensors in the tokenizer haha

@Cyrilvallez Cyrilvallez merged commit 4df2529 into main Sep 18, 2025
21 of 24 checks passed
@Cyrilvallez Cyrilvallez deleted the the-great-cleaning branch September 18, 2025 16:27
vijayabhaskar-ev pushed a commit to vijayabhaskar-ev/transformers that referenced this pull request Oct 2, 2025
…#40760)

* setup
* start the purge
* continue the purge
* more and more
* more
* continue the quest: remove loading tf/jax checkpoints
* style
* fix configs
* oups forgot conflict
* continue
* still grinding
* always more
* in tje zone
* never stop
* should fix doc
* fic
* fix
* fix
* fix tests
* still tests
* fix non-deterministic
* style
* remove last rebase issues
* onnx configs
* still on the grind
* always more references
* nearly the end
* could it really be the end?
* small fix
* add converters back
* post rebase
* latest qwen
* add back all converters
* explicitly add functions in converters
* re-add
yuchenxie4645 pushed a commit to yuchenxie4645/transformers that referenced this pull request Oct 4, 2025
@LysandreJik LysandreJik mentioned this pull request Oct 9, 2025