Conversation

ericspod (Member) commented Sep 18, 2025

Fixes #8341.

Description

This updates the minimum version of Hugging Face Transformers to address security advisories raised against older releases.

Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/` folder.

Signed-off-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
coderabbitai bot (Contributor) commented Sep 18, 2025

Walkthrough

  • Updated dependency constraints in docs/requirements.txt and requirements-dev.txt: replaced conditional "transformers>=4.36.0, <4.41.0; python_version <= '3.10'" with unconditional "transformers>=4.53.0".
  • In monai/networks/nets/transchex.py, added "_attn_implementation": "eager" to the bert_config passed to MultiModal.from_pretrained in Transchex.init.
  • No changes to public APIs or function/class signatures.
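The `transchex.py` change can be pictured with a minimal sketch. The `"_attn_implementation"` key mirrors the actual edit; the other values below are illustrative placeholders, not MONAI's real defaults:

```python
# Sketch of the transchex.py change: the bert_config dictionary now carries
# an explicit attention backend so that Transformers >=4.53 does not silently
# switch backends for the custom BERT layers.
# All values other than "_attn_implementation" are placeholder examples.

def build_bert_config(attn_implementation: str = "eager") -> dict:
    """Return a BERT config dict with an explicit attention backend."""
    return {
        "hidden_size": 768,         # placeholder value
        "num_attention_heads": 12,  # placeholder value
        "_attn_implementation": attn_implementation,
    }

config = build_bert_config()
print(config["_attn_implementation"])  # eager
```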

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)

| Check name | Status | Explanation |
|---|---|---|
| Title Check | ✅ Passed | The title "Transformers version" is short and directly related to the main change (updating the Hugging Face transformers dependency in the requirements files), so it communicates the primary intent succinctly for a quick scan. |
| Docstring Coverage | ✅ Passed | Docstring coverage is 100.00%, above the required threshold of 80.00%. |
| Description Check | ✅ Passed | The PR adheres to the repository template: it includes "Fixes #8341", a brief description noting that the transformers version is being updated for security, and the "Types of changes" checklist with the non-breaking box checked. It omits the rationale for choosing transformers>=4.53.0 and explicit test/CI results, but those are non-critical per the template, so the description is sufficient for reviewers to proceed. |

coderabbitai bot (Contributor) left a comment
Actionable comments posted: 1

🧹 Nitpick comments (2)
monai/networks/nets/transchex.py (2)

336-336: Attn impl key looks right; make it configurable and default-safe.

Using config._attn_implementation="eager" is compatible with Transformers ≥4.51 and aligns with the documented interface; accepted values include "eager", "sdpa", "flash_attention_2", etc. (huggingface.co)

Two tweaks:

  • Expose as an init arg (default "eager") instead of hard-coding.
  • Optionally prefer "sdpa" when available (PyTorch SDPA) for speed, falling back to "eager". (huggingface.co)

Apply within this hunk:

```diff
-            "_attn_implementation": "eager",
+            "_attn_implementation": attn_implementation,
```

Add outside this hunk (constructor signature and docstring param):

```python
def __init__(..., attn_implementation: str = "eager", ...):
    """
    ...
    attn_implementation: attention backend for HF BERT layers ("eager", "sdpa", ...).
    """
```

If you prefer to auto-select SDPA when supported (note that SDPA does not require a CUDA build, so checking for the function is sufficient):

```python
attn_implementation = (
    "sdpa" if hasattr(torch.nn.functional, "scaled_dot_product_attention") else "eager"
)
```
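The same selection can be made robust to an absent or older PyTorch. This is a sketch with a hypothetical helper name, assuming only that recent PyTorch exposes `torch.nn.functional.scaled_dot_product_attention`:

```python
def pick_attn_implementation() -> str:
    """Return "sdpa" when PyTorch exposes scaled_dot_product_attention, else "eager".

    Hypothetical helper: degrades gracefully to "eager" when torch is not
    installed or is too old to provide SDPA.
    """
    try:
        import torch.nn.functional as F  # deferred so the helper works without torch
    except ImportError:
        return "eager"
    return "sdpa" if hasattr(F, "scaled_dot_product_attention") else "eager"
```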

329-336: Minor: stale transformers_version metadata.

The bert_config still sets "transformers_version": "4.10.2" while the project now requires ≥4.53. Consider removing this field or updating it to avoid confusion. This field is informational; changing/removing it won’t affect behavior.

```diff
-            "transformers_version": transformers_version,
+            # Optional metadata; consider removing or updating to match the installed Transformers.
+            # "transformers_version": transformers_version,
```
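One way to keep such metadata honest is to read the installed version at runtime instead of hard-coding it. A stdlib-only sketch (the helper name is illustrative; `importlib.metadata` has been in the standard library since Python 3.8):

```python
from importlib import metadata

def installed_transformers_version(default: str = "unknown") -> str:
    """Return the installed transformers version, or `default` if not installed.

    Illustrative helper: avoids a stale hard-coded "transformers_version" by
    querying package metadata at runtime.
    """
    try:
        return metadata.version("transformers")
    except metadata.PackageNotFoundError:
        return default
```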
📜 Review details

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting

📥 Commits

Reviewing files that changed from the base of the PR and between 946cfdf and b52edd3.

📒 Files selected for processing (3)
  • docs/requirements.txt (1 hunks)
  • monai/networks/nets/transchex.py (1 hunks)
  • requirements-dev.txt (1 hunks)
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py

⚙️ CodeRabbit configuration file

Review the Python code for quality and correctness. Ensure variable names adhere to the PEP8 style guide and are sensible and informative with regard to their function, while permitting simple names for loop and comprehension variables. Ensure routine names are meaningful with regard to their function and use verbs, adjectives, and nouns in a semantically appropriate way. Docstrings should be present for all definitions and should describe each variable, return value, and raised exception in the appropriate section of Google-style docstrings. Examine the code for logical errors or inconsistencies, and suggest what may be changed to address these. Suggest any enhancements that improve efficiency, maintainability, comprehensibility, and correctness. Ensure new or modified definitions will be covered by existing or new unit tests.

Files:

  • monai/networks/nets/transchex.py
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (19)
  • GitHub Check: build-docs
  • GitHub Check: flake8-py3 (codeformat)
  • GitHub Check: packaging
  • GitHub Check: quick-py3 (macOS-latest)
  • GitHub Check: quick-py3 (windows-latest)
  • GitHub Check: min-dep-py3 (3.11)
  • GitHub Check: quick-py3 (ubuntu-latest)
  • GitHub Check: flake8-py3 (pytype)
  • GitHub Check: flake8-py3 (mypy)
  • GitHub Check: min-dep-os (ubuntu-latest)
  • GitHub Check: min-dep-py3 (3.12)
  • GitHub Check: min-dep-pytorch (2.5.1)
  • GitHub Check: min-dep-py3 (3.9)
  • GitHub Check: min-dep-pytorch (2.7.1)
  • GitHub Check: min-dep-pytorch (2.8.0)
  • GitHub Check: min-dep-pytorch (2.6.0)
  • GitHub Check: min-dep-os (windows-latest)
  • GitHub Check: min-dep-os (macOS-latest)
  • GitHub Check: min-dep-py3 (3.10)
🔇 Additional comments (1)
requirements-dev.txt (1)

38-38: Guard against future major-version breaks: add an upper bound.
Cap at <5 until CI verifies v5.

File: requirements-dev.txt (line 38)

```diff
-transformers>=4.53.0
+transformers>=4.53.0,<5
```

Run quick checks in CI/local with Transformers 4.53.x and 4.56.x.
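Whether a candidate release satisfies the proposed pin can be sanity-checked with a tiny version comparison. This is a minimal sketch handling plain `X.Y.Z` strings only; real tooling should use `packaging.specifiers.SpecifierSet` instead:

```python
def parse(version: str) -> tuple:
    """Parse a dotted release string like '4.53.0' into an int tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies_pin(version: str, floor: str = "4.53.0", ceiling_major: int = 5) -> bool:
    """Return True when `version` >= `floor` and below the next major release.

    Sketch of the proposed constraint "transformers>=4.53.0,<5"; handles only
    simple numeric versions, not pre-releases or local version segments.
    """
    v = parse(version)
    return v >= parse(floor) and v[0] < ceiling_major
```

For example, `satisfies_pin("4.56.1")` accepts a current 4.x release while `satisfies_pin("5.0.0")` rejects a future major version until CI verifies it.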

ericspod mentioned this pull request Sep 19, 2025
KumoLiu (Contributor) commented Sep 19, 2025

/build

@KumoLiu KumoLiu merged commit 6327a86 into Project-MONAI:dev Sep 19, 2025
27 checks passed
arthurdjn pushed a commit to arthurdjn/MONAI that referenced this pull request Sep 29, 2025
Fixes Project-MONAI#8341.

### Description

This updates the version of Huggingface transformers to account for
security alerts for old versions.

### Types of changes
<!--- Put an `x` in all the boxes that apply, and remove the not
applicable items -->
- [x] Non-breaking change (fix or new feature that would not break
existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing
functionality to change).
- [ ] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u
--net --coverage`.
- [ ] Quick tests passed locally by running `./runtests.sh --quick
--unittests --disttests`.
- [ ] In-line docstrings updated.
- [ ] Documentation updated, tested `make html` command in the `docs/`
folder.

---------

Signed-off-by: Eric Kerfoot <17726042+ericspod@users.noreply.github.com>
arthurdjn pushed a second commit to arthurdjn/MONAI with the same message, referencing this pull request, on Sep 29, 2025.
Successfully merging this pull request may close these issues.

New high vuln found in transformers