bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files #25058
Conversation
Hello, and thanks for your contribution! I'm a bot set up to make sure that the project can legally accept this contribution by verifying everyone involved has signed the PSF contributor agreement (CLA). We couldn't find a bugs.python.org (b.p.o) account corresponding to the following GitHub usernames. This might simply be due to a missing "GitHub Name" entry in one's b.p.o account settings. This is necessary for legal reasons before we can look at this contribution. Please follow the steps outlined in the CPython devguide to rectify this issue. You can check yourself to see if the CLA has been received. Thanks again for the contribution; we look forward to reviewing it!
Still relevant.
@igorvoltaic: Status check is done, and it's a success ✅.
Thanks @igorvoltaic for the PR 🌮🎉. I'm working now to backport this PR to: 3.10, 3.9.
GH-26190 is a backport of this pull request to the 3.10 branch. |
GH-26191 is a backport of this pull request to the 3.9 branch. |
bpo-43650: Fix MemoryError on zip.read in shutil._unpack_zipfile for large files (pythonGH-25058)

`shutil.unpack_archive()` tries to read the whole file into memory, making no use of any kind of smaller buffer. The process crashes for really large files: e.g. archive ~1.7G, unpacked ~10G. Before the crash it can easily consume all available RAM on smaller systems. Had to pull the code from `zipfile.ZipFile.extractall()` to fix this.

Automerge-Triggered-By: GH:gpshead
(cherry picked from commit f32c795)
Co-authored-by: Igor Bolshakov <ibolsch@gmail.com>
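For illustration, here is a minimal sketch of the streaming approach the commit message describes, following the pattern used by `zipfile.ZipFile.extractall()`. The function name is illustrative and the path-safety checks are simplified; this is not the exact patch:

```python
import os
import shutil
import zipfile

def _unpack_zipfile_streaming(filename, extract_dir):
    """Sketch: extract each member with a bounded buffer instead of
    reading the whole member into memory via ZipFile.read()."""
    with zipfile.ZipFile(filename) as zf:
        for info in zf.infolist():
            name = info.filename
            # Simplified sanity check against absolute or escaping paths.
            if name.startswith('/') or '..' in name:
                continue
            target = os.path.join(extract_dir, *name.split('/'))
            os.makedirs(os.path.dirname(target) or '.', exist_ok=True)
            if not name.endswith('/'):
                # Stream the member to disk in fixed-size chunks;
                # copyfileobj never holds more than its buffer in memory.
                with zf.open(info) as source, open(target, 'wb') as dest:
                    shutil.copyfileobj(source, dest)
```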
Sorry, it's not true.
Good thing to look for. Thankfully the `with` statement context manager closes both on the way out.
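For context, the behavior being referenced: a single `with` statement with multiple context managers closes every managed file on exit, even when the body raises. A self-contained example (file names are illustrative):

```python
import shutil

# Both file objects are guaranteed closed when the block exits, whether
# it finishes normally or via an exception raised inside the body.
with open("member.bin", "rb") as source, open("target.bin", "wb") as dest:
    shutil.copyfileobj(source, dest)

assert source.closed and dest.closed
```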
`shutil.unpack_archive()` tries to read the whole file into memory, making no use of any kind of smaller buffer. The process crashes for really large files: e.g. archive ~1.7G, unpacked ~10G. Before the crash it can easily consume all available RAM on smaller systems. Had to pull the code from `zipfile.ZipFile.extractall()` to fix this.

https://bugs.python.org/issue43650

Automerge-Triggered-By: GH:gpshead
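For reference, the affected call path; the archive and destination paths below are illustrative:

```python
import shutil

# Before the fix, unpacking a zip with multi-gigabyte members could
# raise MemoryError because each member was read fully into memory.
# With the fix, members are streamed to disk with a bounded buffer.
shutil.unpack_archive("big-archive.zip", "extracted/")
```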