fix(deps): Update security updates [SECURITY] #4154
Merged
This PR contains the following updates:
| Package | Change |
| --- | --- |
| filelock | ==3.20.0 → ==3.20.1 |
| marshmallow | ==3.26.1 → ==3.26.2 |
| pypdf | ==6.3.0 → ==6.4.0 |
| urllib3 | ==2.5.0 → ==2.6.0 |

GitHub Vulnerability Alerts
CVE-2025-68146
Impact
A Time-of-Check-Time-of-Use (TOCTOU) race condition allows local attackers to corrupt or truncate arbitrary user files through symlink attacks. The vulnerability exists in both Unix and Windows lock file creation where filelock checks if a file exists before opening it with O_TRUNC. An attacker can create a symlink pointing to a victim file in the time gap between the check and open, causing os.open() to follow the symlink and truncate the target file.
Who is impacted:
All users of filelock on Unix, Linux, macOS, and Windows systems. The vulnerability also cascades to dependent libraries that create lock files through filelock (for example, virtualenv and the PyTorch cache, as discussed in the attack scenarios below).
Attack requires local filesystem access and ability to create symlinks (standard user permissions on Unix; Developer Mode on Windows 10+). Exploitation succeeds within 1-3 attempts when lock file paths are predictable.
Patches
Fixed in version 3.20.1.
Unix/Linux/macOS fix: Added O_NOFOLLOW flag to os.open() in UnixFileLock._acquire() to prevent symlink following.
Windows fix: Added GetFileAttributesW API check to detect reparse points (symlinks/junctions) before opening files in WindowsFileLock._acquire().
Users should upgrade to filelock 3.20.1 or later immediately.
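The Windows-side fix described above relies on a reparse-point check via the Win32 GetFileAttributesW API. A rough sketch of that kind of check, not the library's exact code, could look like this:

```python
import ctypes
from ctypes import wintypes

FILE_ATTRIBUTE_REPARSE_POINT = 0x0400
INVALID_FILE_ATTRIBUTES = 0xFFFFFFFF

# Windows-only: ctypes.windll is unavailable on other platforms.
_GetFileAttributesW = ctypes.windll.kernel32.GetFileAttributesW
_GetFileAttributesW.argtypes = [wintypes.LPCWSTR]
_GetFileAttributesW.restype = wintypes.DWORD

def is_reparse_point(path: str) -> bool:
    """Return True if path is a symlink or NTFS junction."""
    attrs = _GetFileAttributesW(path)
    return attrs != INVALID_FILE_ATTRIBUTES and bool(attrs & FILE_ATTRIBUTE_REPARSE_POINT)

# A lock implementation can refuse to open the lock file when this returns True.
```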
Workarounds
If an immediate upgrade is not possible, restricting write access to the directory that holds the lock file can raise the bar for an attacker (see the notes on directory permissions below).
Warning: such workarounds provide only partial mitigation; the race condition remains exploitable. Upgrading to version 3.20.1 is strongly recommended.
Technical Details: How the Exploit Works
The Vulnerable Code Pattern
Unix/Linux/macOS: src/filelock/_unix.py:39-44
Windows: src/filelock/_windows.py:19-28
The Race Window
The vulnerability exists in the gap between operations:
Unix variant:
Windows variant:
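As a rough illustration of the pattern described above (a simplified sketch, not the library's exact code), the Unix-side acquire reduces to a check-then-open sequence with a window between the two steps; the Windows variant follows the same shape:

```python
import os

def acquire_vulnerable(lock_file: str) -> int:
    """Simplified sketch of the pre-3.20.1 check-then-open pattern (illustrative only)."""
    flags = os.O_RDWR | os.O_TRUNC
    if not os.path.exists(lock_file):        # time of check
        flags |= os.O_CREAT
    # --- race window ---
    # A local attacker who can predict lock_file may place a symlink here,
    # e.g. os.symlink("/home/victim/important.data", lock_file)
    return os.open(lock_file, flags, 0o644)  # time of use: the symlink is followed and O_TRUNC empties the target
```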
Step-by-Step Attack Flow
1. Attacker Setup:
2. Attacker Creates Race Condition:
3. Victim Application Runs:
4. What Happens Inside os.open():
On Unix systems, when os.open() is called without the O_NOFOLLOW flag, the kernel follows the symlink and truncates the target file.
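A hedged sketch of how adding O_NOFOLLOW changes this step (the error handling here is assumed; the upstream implementation may differ):

```python
import errno
import os

def acquire_fixed(lock_file: str) -> int:
    """Open the lock file without ever following a symlink placed at that path."""
    flags = os.O_RDWR | os.O_CREAT | os.O_TRUNC | os.O_NOFOLLOW
    try:
        return os.open(lock_file, flags, 0o644)
    except OSError as exc:
        # ELOOP (Linux/macOS) or EMLINK (some BSDs) signals a symlink at lock_file:
        # refuse instead of truncating whatever it points to.
        if exc.errno in (errno.ELOOP, errno.EMLINK):
            raise PermissionError(f"refusing to open symlinked lock file: {lock_file}") from exc
        raise
```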
Why the Attack Succeeds Reliably
Timing Characteristics:
Success factors:
Real-World Attack Scenarios
Scenario 1: virtualenv Exploitation
Scenario 2: PyTorch Cache Poisoning
Why Standard Defenses Don't Help
File permissions don't prevent this:
Directory permissions help but aren't always feasible:
File locking doesn't prevent this:
Exploitation Proof-of-Concept Results
From empirical testing with the provided PoCs:
Simple Direct Attack (filelock_simple_poc.py):
virtualenv Attack (weaponized_virtualenv.py):
PyTorch Attack (weaponized_pytorch.py):
Discovered and reported by: George Tsigourakos (@tsigouris007)
CVE-2025-68480
Impact
Schema.load(data, many=True) is vulnerable to denial of service attacks. A moderately sized request can consume a disproportionate amount of CPU time.
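For context, the affected call path looks like the following; the schema and payload are hypothetical and only illustrate where untrusted, many-item input reaches Schema.load:

```python
from marshmallow import Schema, ValidationError, fields

class ItemSchema(Schema):          # hypothetical schema, for illustration only
    name = fields.Str(required=True)
    price = fields.Float(required=True)

payload = [{"name": "widget", "price": "not-a-number"}] * 1000  # untrusted request body

try:
    # Validating many objects in a single call is the code path affected
    # by the CPU-time issue prior to 3.26.2 / 4.1.2.
    ItemSchema().load(payload, many=True)
except ValidationError as err:
    print(len(err.messages), "invalid items")
```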
Patches
4.1.2, 3.26.2
Workarounds
CVE-2025-66019
Impact
An attacker who exploits this vulnerability can craft a PDF that leads to memory usage of up to 1 GB per stream. This requires parsing the content stream of a page using the LZWDecode filter.
This is a follow up to GHSA-jfx9-29x2-rv3j to align the default limit with the one for zlib.
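For context, page content streams are parsed by routine read paths such as text extraction; a minimal sketch (the file name is hypothetical) of code that would hit the LZWDecode path on a crafted document:

```python
from pypdf import PdfReader

# Text extraction parses each page's content stream; with a crafted
# LZWDecode stream, versions before 6.4.0 could allocate up to ~1 GB
# per stream during this step.
reader = PdfReader("untrusted.pdf")   # hypothetical attacker-supplied document
for page in reader.pages:
    page.extract_text()
```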
Patches
This has been fixed in pypdf==6.4.0.
Workarounds
If users cannot upgrade yet, they can overwrite the default limit in their code with the one-line workaround given in the upstream advisory.
CVE-2025-66418
Impact
urllib3 supports chained HTTP encoding algorithms for response content according to RFC 9110 (e.g., Content-Encoding: gzip, zstd). However, the number of links in the decompression chain was unbounded, allowing a malicious server to insert a virtually unlimited number of compression steps, leading to high CPU usage and massive memory allocation for the decompressed data.
Affected usages
Applications and libraries using urllib3 version 2.5.0 and earlier for HTTP requests to untrusted sources unless they disable content decoding explicitly.
Remediation
Upgrade to at least urllib3 v2.6.0 in which the library limits the number of links to 5.
If upgrading is not immediately possible, use preload_content=False and ensure that resp.headers["content-encoding"] contains a safe number of encodings before reading the response content.
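A minimal sketch of that mitigation, assuming an illustrative URL and a conservative threshold of two encodings (the patched library itself allows up to five):

```python
import urllib3

http = urllib3.PoolManager()
# preload_content=False defers reading/decoding the body.
resp = http.request("GET", "https://example.com/data", preload_content=False)

encodings = [e.strip() for e in resp.headers.get("Content-Encoding", "").split(",") if e.strip()]
if len(encodings) > 2:        # threshold is an assumption for this sketch
    resp.release_conn()
    raise ValueError(f"refusing to decode {len(encodings)} chained encodings")

body = resp.read()            # decoding only happens after the header check
resp.release_conn()
```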
CVE-2025-66471
Impact
urllib3's streaming API is designed for the efficient handling of large HTTP responses by reading the content in chunks, rather than loading the entire response body into memory at once.
When streaming a compressed response, urllib3 can perform decoding or decompression based on the HTTP Content-Encoding header (e.g., gzip, deflate, br, or zstd). The library must read compressed data from the network and decompress it until the requested chunk size is met. Any resulting decompressed data that exceeds the requested amount is held in an internal buffer for the next read operation.
The decompression logic could cause urllib3 to fully decode a small amount of highly compressed data in a single operation. This can result in excessive resource consumption (high CPU usage and massive memory allocation for the decompressed data; CWE-409) on the client side, even if the application only requested a small chunk of data.
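For illustration, this is the kind of streaming usage the advisory is about (the URL is a placeholder); upgrading to 2.6.0 keeps each chunk's decompression bounded by the requested amount:

```python
import urllib3

http = urllib3.PoolManager()
# preload_content=False keeps the body unread so it can be streamed in chunks.
resp = http.request("GET", "https://example.com/large-report.gz", preload_content=False)

total = 0
for chunk in resp.stream(256):   # ask for ~256 decoded bytes at a time
    total += len(chunk)          # before the fix, a single chunk could trigger a huge decompression
resp.release_conn()
print(total, "bytes decoded")
```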
Affected usages
Applications and libraries using urllib3 version 2.5.0 and earlier to stream large compressed responses or content from untrusted sources.
stream(), read(amt=256), read1(amt=256), read_chunked(amt=256), and readinto(b) are examples of urllib3.HTTPResponse method calls using the affected logic unless decoding is disabled explicitly.
Remediation
Upgrade to at least urllib3 v2.6.0 in which the library avoids decompressing data that exceeds the requested amount.
If your environment contains a package facilitating the Brotli encoding, upgrade to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 too. These versions are enforced by the urllib3[brotli] extra in the patched versions of urllib3.
Credits
The issue was reported by @Cycloctane.
Supplemental information was provided by @stamparm during a security audit performed by 7ASecurity and facilitated by OSTIF.
Release Notes
tox-dev/py-filelock (filelock)
v3.20.1 (Compare Source)
What's Changed
Full Changelog: tox-dev/filelock@3.20.0...3.20.1
marshmallow-code/marshmallow (marshmallow)
v3.26.2 (Compare Source)
Bug fixes:
CVE-2025-68480: Merge error store messages without rebuilding collections. Thanks 카푸치노 for reporting and deckar01 for the fix.
py-pdf/pypdf (pypdf)
v6.4.0 (Compare Source)
Performance Improvements (PI)
Bug Fixes (BUG)
Documentation (DOC)
Maintenance (MAINT)
Full Changelog
urllib3/urllib3 (urllib3)
v2.6.0 (Compare Source)
Security
- Fixed excessive decompression of highly compressed HTTP content ("decompression bombs") leading to excessive resource consumption even when a small amount of data was requested. Reading small chunks of compressed data is safer and much more efficient now. (GHSA-2xpw-w6gg-jr37: https://github.com/urllib3/urllib3/security/advisories/GHSA-2xpw-w6gg-jr37)
- A malicious server could previously send virtually unlimited links in the Content-Encoding header, potentially leading to a denial of service (DoS) attack by exhausting system resources during decoding. The number of allowed chained encodings is now limited to 5. (GHSA-gm62-xv2j-4w53: https://github.com/urllib3/urllib3/security/advisories/GHSA-gm62-xv2j-4w53)

Caution: If urllib3 is not installed with the optional urllib3[brotli] extra, but your environment contains a Brotli/brotlicffi/brotlipy package anyway, make sure to upgrade it to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 to benefit from the security fixes and avoid warnings. Prefer using urllib3[brotli] to install a compatible Brotli package automatically. If you use custom decompressors, please make sure to update them to respect the changed API of urllib3.response.ContentDecoder.

Features
- … HTTPHeaderDict using bytes keys. (#3653: https://github.com/urllib3/urllib3/issues/3653)
- … HTTPConnection. (#3666: https://github.com/urllib3/urllib3/issues/3666)
- … (#3696: https://github.com/urllib3/urllib3/issues/3696)

Removals
- Removed the HTTPResponse.getheaders() method in favor of HTTPResponse.headers.
- Removed the HTTPResponse.getheader(name, default) method in favor of HTTPResponse.headers.get(name, default). (#3622: https://github.com/urllib3/urllib3/issues/3622)

Bugfixes
- … urllib3.PoolManager when an integer is passed for the retries parameter. (#3649: https://github.com/urllib3/urllib3/issues/3649)
- … HTTPConnectionPool when used in Emscripten with no explicit port. (#3664: https://github.com/urllib3/urllib3/issues/3664)
- … SSLKEYLOGFILE with expandable variables. (#3700: https://github.com/urllib3/urllib3/issues/3700)

Misc
- … zstd extra to install backports.zstd instead of zstandard on Python 3.13 and before. (#3693: https://github.com/urllib3/urllib3/issues/3693)
- … BytesQueueBuffer class. (#3710: https://github.com/urllib3/urllib3/issues/3710)
- … (#3652: https://github.com/urllib3/urllib3/issues/3652)
- … (#3638: https://github.com/urllib3/urllib3/issues/3638)

Configuration
📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
👻 Immortal: This PR will be recreated if closed unmerged. Get config help if that's undesired.
This PR has been generated by Renovate Bot.