Vulnerable to zip slip #71
Comments
Now 'npm audit' flags this vulnerability as high risk. I am using FilePond in my React app, and because FilePond depends on this package I got 9 high-risk vulnerabilities, none of which can be resolved because there is no patch available yet. Please, somebody who knows this package, submit a fix.
What is the alternative solution to decompress? Any other library?
NPM security needs to get a grip. This isn't a security problem. Does this bypass filesystem security? Nope. i.e. every file in an archive is an "arbitrary file".
Maybe an archive could write to some application folder where its contents could be executed, or could cause errors in applications. In my opinion this isn't minor either.
The problem isn't that the files are arbitrary; it's that unpacking an archive can result in files being written to arbitrary paths. So even if you unpack to one directory, files can land outside it. Definitely a security risk.
@kevva @sindresorhus are you guys aware of this?
The issue alone is not a security problem, but it opens the door for security issues, as many packages depend on this one. I had a look at this issue and found that it originates in the package decompress-tar.
Nah. Only if you are doing something insane like running node as root on a web service and using it to unpack user-provided tarballs. Again, you can only write to the filesystem if you have permission.
Nerfing functionality because (if used improperly) it might be a security problem is not the same thing as nerfing functionality because there is no way to use it securely. For anything Node can do on the filesystem, I can come up with a scenario where, if the circumstances are set up just so, it can do something arbitrary and bad. Understanding security means knowing when you have set up a scenario where an attacker can do something bad and there is nothing you can do to prevent it.

There is legitimate utility in "arbitrary paths" in a tarball. For instance, when the source of the tarball is known by the software author and its paths are expected. Say I've written a command line utility which is meant to write to the system; I'm gonna have to instruct my users to run it with elevated privileges. Off the top of my head, here's one way decompress already mitigates this "problem":
Here's an idea: add an option gating the unsafe behavior. But just be sure to semver this so that you recognize you are breaking the interface.
After some investigation of the behavior of the command line utilities (and notwithstanding the NPM security group's habitual and lazy scarlet-lettering of packages), I recant my objection on the grounds that this is a bug for other reasons. It looks like BSD tar (such as in Darwin) does allow entries to escape the extraction directory, whereas I think the Linux behavior of treating such entries as errors is the right one. Workaround with the current version to sanitize an arbitrary tarball:
Yes, this is very much a security risk. Running code as root is not a vulnerability in itself; it's just not preferred practice if you can help it. It could be something as simple as an automated script. There are still a lot of things only root can do. Linux has been doing a better job of giving privileges to non-root users, but there are still several things that require root access. Not to mention there are still several systems that require root access for something as simple as listening on any port under 1024.

It might even be something more insidious, like having ~/.node_modules in the path, where it installs a local module that could overwrite modules a lot of people install globally, like npm or webpack.

Code by default should be secure, WITH the possible option of bypassing that and warning the developer of possible issues. Most *nix archivers (e.g. tar/zip) do not allow extraction outside of the specified directory (think of it as a root for that dir) unless you pass a specific option to allow files outside of it. unzip requires -: -> https://linux.die.net/man/1/unzip
If a sys admin or coder requires continuously running your node app as root to get ports below 1024 (or any other privileged system activity), frankly they are not competent to be writing/running web services. Sure, in the most naive setups you may need to use root in some capacity to "initiate" a service, but you would never keep that web service acting under the powers of root in normal operating circumstances; you would perform what needed to be done with privilege and then de-escalate your privileges.
Also, you kinda missed the point. We don't nerf functionality in a language or "utility function" simply because it can be used badly. Again, "may cause security consequences" and "no way to avoid security consequences" are different levels of risk, and the current behavior falls squarely in the former classification (it's quite possible to use the current version with no risk whatsoever; see my workaround from a previous post). Not high.
That was a for instance, but it could be something as simple as accessing files that only root has access to, and only root SHOULD have access to. Saying you can't write code in NodeJS for root because it has unintended security side effects is a terrible excuse.
The REST of the world basically thought it was a security issue, which is why it got patched (https://nvd.nist.gov/vuln/detail/CVE-2001-1267), even if you don't think it was. -P was added to tar when they stopped allowing absolute paths to change files (hackers were putting /etc/passwd and /etc/shadow in the tar). The problem was that they still allowed traversing outside of the extraction path, which IS/WAS a security bug, and they patched it on 2001-08-27 (version 1.13.20).
https://access.redhat.com/security/cve/cve-2001-1267 Now, this vulnerability was marked as low, but the nodejs one is marked as high. I'm not sure why they did that; I personally don't think it warrants a high, maybe a medium at most. I'm not sure if this code does it, but another issue in tar was the ability to create a symbolic link to an absolute path and then extract through that link, which they fixed as well.
You do patch these issues even if it breaks (I think that's what you meant by "nerf") functionality, because developers DON'T expect underlying code to have security problems. The default act of ANY "utility" should be SAFE/SECURE even if it breaks functionality. If that functionality is still needed, then the developer should provide a way to EXPLICITLY bypass said security. It should take effort to make things insecure, because you are intentionally doing so, but it should not take effort to secure them. Almost every utility in the past has done so, and the ones that didn't, people are not using anymore.
So you are saying we should patch Node's core fs module, then?
Umm no. Wow, you went from completely right to completely left. The people who patched tar didn't tell the OS people to fix their code. The utility allowed it to happen, so they fixed it. When I tell tar/zip to extract an archive to a directory, it shouldn't be expected that it'll change files outside of that directory. That's completely insane. They require an option to say, "Yeah, I want you to possibly do bad things here, but I'm giving you permission, because I know what I'm doing." It shouldn't be any different here.
Obviously, my example is absurd. It's hyperbole to demonstrate that logically the cause of the vulnerability, the "ability to write arbitrary files", is true of the node core utilities, of which this library is an extension. From a technical perspective (what your program should be able to do to the filesystem, given sufficient privilege), this is not a security problem at all. But let me agree with you now: it might be, from a "social" perspective, in that a security exploit strategy can be to subvert the coder's social expectations of outcome. Suppose we write the acceptance criteria of "an archive" (never mind the different skillsets/expectations of the respective user bases of tar vs zip) as a set of rules:
If this set of assumptions is true for all "users" of decompress (historically this is more true for zip users than tar users), then this is both a bug, and it also has the potential to be exploited (if and only if another security practice is subverted, such as needlessly giving write access to the filesystem in areas outside the domain of your application). My problem with this notice is that:
An alternative social understanding of an archive (one that could very well exist for many programmers):
Programmers operating under this understanding will already be wary of blindly trusting the contents of an archive, and when input sanitization is necessary, they will certainly do so (using the filter option). When I have historically executed an extraction with tar, it has always been my understanding that I'm extracting "these files, with path instructions", when relevant, relative to this directory.

Because what we do with computers is force multipliers building on different levels of shared understanding of abstract concepts and general-purpose utility, there is no such thing as SAFE/SECURE, not objectively and not ultimately, only subjectively and circumstantially. There is only safe/secure in "this context", "this usage", with "this shared understanding", with "this desired outcome", and under "these circumstances". You can put bumper rails on everything you use, but the only safe computer is one that is turned off.
No, then it would be impossible to write any utilities that DO add/update system files explicitly. The utility is the only place to know what's permissible or not. Also, it would prevent extraction into a temporary directory outside the cwd. Requiring a process to change directories in order to modify files and directories would cause too many side effects.
I agree, but what is expected from a security aspect is completely different from what it's capable of. For example, if you write a utility to add users to /etc/passwd and /etc/shadow, then if it has access, it should be able to. But generic code to extract from an archive shouldn't arbitrarily delete all users from the system and change the root password because a third-party archive sets that as a location.
Again, no, parent-relative paths ARE allowed as long as they are contained within the extraction directory (think chroot); e.g. "test/../test.txt" is a perfectly valid file and should be extracted.
Agreed, but authors should be responsive to security concerns and fix them when they are raised (or just close the project down). It could also be a positive: I'm more likely to use a third party if they fix security issues in a timely manner (based on the complexity of the problem).
Completely true, but that might not be the case for packages that automate things.
Of course, but to treat fs.writeFile() and tar as similar is disingenuous. writeFile modifies a single file, and you have to specifically tell it the file to update. I wouldn't expect webpack to write to system configuration files. Wherever the code is that writes the file, it's THEIR responsibility to sanitize the filename. For example, JSZip doesn't have to worry about it, because they give the data back in a callback for the user of the utility to determine how they want to handle it. In that case, it's the responsibility of the person using JSZip to sanitize the files written to disk. If decompress were more like that, then I would say the problem does not lie with the decompress package but with whoever is using it.
It's always going to be blind trust, unless the programmers using decompress are creating the archives themselves (which is most likely never the case).
It appears that decompress is vulnerable to archives containing files that hold path-traversal names such as ../../outside.txt. As a PoC I have attached a .tar.gz archive (slip.tar.gz) that will, when extracted, create a file in /tmp/slipped.txt. Use the example code to extract:
Note that this will not work out-of-the-box with .zip archives, since yauzl will throw an exception if the entry's filename contains "..". However, since this package also supports symlinks, we can use that instead to bypass this limitation.
But just adding a symlink to our desired target location won't do the trick: since the archive's files are extracted asynchronously (index.js#44-73), we end up in a race condition between extracting the symlink and the file referencing it.
Instead, create a directory, and within it a symlink to its parent directory. Continue by creating a symlink to / and a file whose name chains through the looped symlinks to the target. Adding this structure to an archive allows us to always win the race!
Here's a PoC (slip.zip) that, when extracted, will create a file in /tmp/slipped_zip.txt, demonstrating how this also works for zip archives using symlinks.