Removing primordials from Node.js project #1438
Can we have a draft to see exactly how much of a performance improvement we are talking about?
I don't think this proposal is reasonable, primordials are a useful tool for reliability of the software, and most of the time have no impact on performance. Also as you mentioned, the TSC has already voted for the use of primordials in the error path, so your proposal should be compatible with that vote.
That's not a con, it's just a fact.
We do not use primordials for the purpose of security.
It's really unfair to state "we never follow them", in reality those changes tend to be watched carefully and not land if perf regression is shown.
Well what does it tell you? Not every part of
It's already in place, we have the
That's not a con either.
I feel strongly that we should keep primordials when it is required for web standards (for example EventTarget should be reasonably tamper proof because not doing so would violate spec semantics). I also feel strongly we should keep primordials in "critical" error paths that must never throw. I am +1 on removing primordials everywhere else; multiple people mentioned they are a burden to contribution and they make the code less readable, debuggable and fun to work on, even if performance was not an issue. They create friction in PRs, cause CI reruns and often needless churn. To be clear - it's not that they're not valuable, it's that their value is greatly outweighed by their burden IMO. It's important to echo @aduh95's point - primordials are about reliability and are not a security mechanism; they do not provide tamper proofing nor do they sandbox anything. @aduh95 the fact a TSC vote happened doesn't mean the TSC can't change its mind? Personally I'm a lot less pro-primordials than I was 2 years ago given feedback several people gave me about how it impacts their contribution experience. Also the TSC itself changes in composition.
A JS platform that falls apart when someone does a normal JavaScript action would be an embarrassment. Primordials may not be the best or only way to solve this problem, but simply not solving it is something I would hope is deemed wildly unacceptable.
I'm not sure, this has never happened before AFAICT. I still think Yagiz should try to draft a proposal that's compatible with it, or at least one that does not overlap with it, so the discussion can be more productive. Depending on the outcome of that proposal, you could consider overriding the previous vote, but now's not the time IMO.
This is more of an opinion than a statement of fact. IMHO even though there are clear benefits, the cons outweigh the pros. @anonrig I don't think the current consensus of only using primordials for the error path is too bad.
I think trying things and learning from them is a strength, the TSC was super opposed to promises at one point for example to the point people were actively mocking me for advocating them (maybe it was the CTC? it was a long time ago when that was still a thing :)). I change my mind around a lot of things personally though admittedly most of those things weren't a TSC vote. I think it would be productive to focus on arguments for and against usage of primordials (generally and in specific cases) though to be clear if my PoV doesn't reach consensus (primordials in critical error paths and web standards and not elsewhere) I'd be fine with that.
I don't think that's a productive statement, especially given that none of the userland packages use them, so I don't think we're giving a meaningful guarantee with them anyway. Additionally, I don't like the language "would be an embarrassment", since it raises the question "to whom?" and assumes other people's feelings about stuff.
The issue isn't "primordials solve nothing so let's remove them", it's "the new problems they create in our code and the friction they add are maybe not worth the thing they're solving for us".
@benjamingr +1 on that
@benjamingr all of my userland packages do, in fact, use them. I apologize for the choice of language; how better could I indicate how serious I think this is?
First of all - 😮 😅
"Hey, I feel strongly that we should keep using primordials. They are beneficial because they prevent issues like A,B or C which are important to users and people have run into (e.g here are some issues people opened complaining about it D, E or F). I think the project would lose from removing them." Basically just objective unambiguous facts about why you think your opinion is correct would have the best chances of impacting the peoples' (and the TSC's) opinion, ideally with numbers. |
The nature of JavaScript is such that you can't defend against that, but if your code is first-run, you can cache everything you need.
I use https://npmjs.com/call-bind and https://npmjs.com/get-intrinsic to achieve it.
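A minimal sketch of that first-run caching pattern, assuming the public APIs of the get-intrinsic and call-bind packages (the exported helper and its name are made up for illustration):

```js
'use strict';
// Cache the original built-ins before any other code gets a chance to run.
const GetIntrinsic = require('get-intrinsic');
const callBind = require('call-bind');

const $slice = callBind(GetIntrinsic('%Array.prototype.slice%'));
const $push = callBind(GetIntrinsic('%Array.prototype.push%'));

// Hypothetical library function: keeps working even if user code later
// replaces Array.prototype.slice or Array.prototype.push, because the
// cached receiver-first wrappers never consult the prototype again.
module.exports = function lastTwoWithMarker(list, marker) {
  const tail = $slice(list, -2);
  $push(tail, marker);
  return tail;
};
```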
I agree! but that doesn't mean that node shouldn't be robust against that. Lots of use cases for node have zero dependencies - especially with the test runner, where deleting builtins is actually a more common thing to simulate different environments and test an implementation against it, and robustness in that case is important
Thanks, I appreciate that feedback and will try to incorporate it.
@benjamingr it's all fine if TSC members change their mind, I do that all the time too, highly recommend it, past Antoine was a complete idiot most of the time. Still, the TSC decision needs to be somewhat final, otherwise we cannot move forward. I don't know if the TSC charter says anything about it, but anyway, it's going to create off-topic discussions to try to override an existing vote; I think we can discuss primordials while accepting the outcome of the previous vote. My personal opinion on primordials is that they provide more upside than downside; I don't know if it's because I lack empathy or something, but I don't think they create a lot of friction in PRs, and it doesn't seem to me that they're a burden to contribution. If anything, I think they make the code more fun and debuggable, but I've learned that I might be in the minority here.
Yes, but if you load "Package A" that changes a prototype, then "Package B" that is loaded after it will get a polluted prototype. This can actually be useful (e.g. for instrumentation), and preventing it would make certain use cases impossible (e.g. people writing instrumentation that "deeply" hooks a server library). I do think it would be very helpful to be able to provide this guarantee (though for me it's only meaningful if it's a guarantee for actual user code, which is likely to use many dependencies). It would be ideal for the project to promote a proposal to address this at the ECMAScript level (which is where I believe this naturally belongs) rather than attempt to ship "semi correct" approaches that provide very fragile guarantees.
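A small, self-contained illustration of that load-order point (the patched method and the logging are assumed for the example, not taken from any real package):

```js
'use strict';
const { EventEmitter } = require('node:events');

// "Package A": an instrumentation-style dependency that patches a prototype
// at load time (think of an APM agent observing every event emission).
const originalEmit = EventEmitter.prototype.emit;
EventEmitter.prototype.emit = function patchedEmit(...args) {
  console.log('observed event:', args[0]);
  return originalEmit.apply(this, args);
};

// "Package B": loaded afterwards, it uses the built-in class as usual and
// transparently sees A's patched prototype. Pollution and useful hooking
// are the same mechanism; only the intent differs.
const emitter = new EventEmitter();
emitter.on('hello', () => console.log('handler ran'));
emitter.emit('hello'); // logs "observed event: hello", then "handler ran"
```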
The TSC vote about primordials was a year and a half ago, quite a respectable amount of time in software I think? I'm personally in favor of keeping primordials in error paths (and was not part of that vote) but I think it's fine to raise the issue again and see if the opinion of folks changed.
I acknowledge that for some people (like yourself and probably others) they don't create a burden to contribution. I don't understand how they make stuff more fun or debuggable though :D I've had several contributors complain about it to me in the past. I personally find them a bit tedious (a bit more so recently) but I've reviewed, approved and helped land a bunch of primordials changes before that perspective shift. Even before that it was "something I put up with to contribute" like GYP, different/weird debugging/testing flows etc.
@benjamingr yes, that's why the first thing you load in an application, to be safe, should be the code that sets up the environment - shims/polyfills, things that can be cached like get-intrinsic, etc. I 100% believe this should be handled in TC39 itself, but I don't think node should give up this important robustness property in the meantime.
I would love to not need primordials (not least because they're confounding to new contributors—if Antoine wasn't here, we'd be kinda screwed). However, they provide a very important function that I think we can't just decide to stop caring about. If we had some way to not need them, I'm all ears!
This is the rule we’re supposed to be following? I thought I was supposed to use primordials everywhere, for any method for which we had a primordial available. You’re telling me I could have limited primordials to just error paths this whole time?
This is incorrect. Any prototype pollution is accepted as a vulnerability by our threat model, i.e. causing a modification of a prototype as a side effect of calling a function in Node.js core. However, internals that are not hardened against a prototype pollution that already happened are not vulnerabilities, because they are unexploitable. Hopefully this distinction is clear.
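To make that distinction concrete, here is a hedged sketch; the function names are hypothetical and are not Node.js internals:

```js
'use strict';

// Case 1: would be a vulnerability under the threat model described above,
// because merely calling the API pollutes a shared prototype as a side effect.
function hypotheticalCoreApi(userOptions) {
  Object.assign(Object.prototype, userOptions); // e.g. { isAdmin: true } leaks everywhere
}

// Case 2: not a vulnerability under that threat model. User code has already
// tampered with a prototype, and an internal helper that trusted it misbehaves.
Array.prototype.push = function () { throw new Error('tampered'); };
function hypotheticalInternalHelper(items) {
  const out = [];
  for (const item of items) out.push(item); // now throws, but only because the
  return out;                               // caller broke their own environment
}
```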
Primordials are a "least bad" solution, and even then, only because there are so few solutions available for the problem they're intended to solve. Reading through this, and some of the other discussions around primordials, and having done some work to in the efforts to get minipass and glob into core, I think a few things should be uncontroversial. (If you disagree, or think I'm overstating something or making an unfair generalization, I'm curious about it! That's not my intent.)
I don't know of any alternative way to achieve the goals, but there are things that could maybe be explored to reduce the costs, which I've heard suggested before:
Primordials are a low-value local-maximum solution; just good enough to keep better solutions from being explored, but not actually very good. None of the better solutions are particularly easy or satisfying, either, so the moat around it is deep.
I'd also like to note that I have been pursuing multiple proposals, over years, and with years to come, with the goal of making primordial-esque safety much more ergonomic for JS developers - and I've used (and will be using) node's primordials (and their UX difficulties) as a big part of the motivation. In other words, I don't think this primordials approach is going to be necessary in node forever, but I think it should remain until there's a better option.
@isaacs I mostly agree with your summary and think it represents the opinion of the majority fairly. If I may, I'd like to note a few things regarding your suggestions:
I don't think it's achievable, otherwise we would have done it. It might be possible with some stricter variant of TypeScript, but inferring from "normal" JS what primordials to use is not possible.
Regarding TC39, if you read through tc39/proposal-call-this#8, it seems they won't be interested in offering new syntax for a problem that's deemed too niche.
It seems a bit contradictory: either we want primordials usage maximized as much as possible, and we use benchmark CIs to detect when it's not a good idea; or we want it minimized as much as possible, and benchmarks are not very important.
Either primordials are necessary for security or they’re not. If there are certain parts of Node where we’re like, “nah, don’t bother with primordials here because this code is too important,” then why use them anywhere? Only some parts of Node are worth protecting from intrusion, and not others?
I agree with @GeoffreyBooth 100%.
I totally agree that all parts should be covered by primordials.
It's worth considering that the cost of primordials is uneven. A lot of the perf impact is in the prototype methods that get wrapped in a closure so they can look like a regular function that shifts its "this" value into the first parameter position. Would the perf be better if prototype primordials instead just were the original function and had to be called with ReflectApply in place? The other one is the Safe* types: do we know the perf difference between building those versus just using the primordial functions on the plain types? I suppose with those types we want the objects to remain tamper proof after handing them to user code? Can we achieve that correctly by freezing the objects we expose instead?
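For context, a rough sketch of the two calling shapes being compared (simplified; the real primordials are generated during Node's internal bootstrap rather than hand-written like this):

```js
'use strict';

// Shape 1: the uncurried wrapper. The prototype method becomes a plain
// function whose first parameter is the receiver ("this" shifted left).
const { bind, call } = Function.prototype;
const uncurryThis = bind.bind(call); // fn => (thisArg, ...args) => fn.call(thisArg, ...args)
const ArrayPrototypePush = uncurryThis(Array.prototype.push);

const arr = [];
ArrayPrototypePush(arr, 1, 2); // instead of arr.push(1, 2)

// Shape 2: the alternative raised above. Keep a reference to the original
// function and invoke it in place with Reflect.apply, skipping the wrapper.
const ReflectApply = Reflect.apply;
const push = Array.prototype.push; // cached once at startup
ReflectApply(push, arr, [3, 4]);   // instead of arr.push(3, 4)
```

Whether the second shape is actually faster would need the usual benchmark run; the sketch only shows the difference in call shape.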
Yes, of course some parts are more critical or sensitive than others. You might want to protect something like EventTarget, or process.exit, or default constructors, but other performance-critical parts would be better off being more performant or readable instead. The project trusts the expertise of the team members to weigh the pros/cons in a lot of situations. Trade-offs are made in a lot of parts of the code (in general), and primordials (which I very much dislike; their pros are IMO overblown and their cons downplayed) are no different.
Just my two cents. While both performance and avoiding prototype pollution are important, I think contributor accessibility and ergonomics are very important too. People work on Node mostly for free because they find it fun and interesting. If we don't value that, we will turn away developers and stagnate. That being said, I'm sure that there are developers working on Node because they think primordials are an important and interesting topic. But I'm not sure that's where the majority lies. While we haven't yet done a post-mortem on what motivated Bun over working on improving Node, I personally believe that if we don't keep Node interesting and accessible for contributors, we will end up losing valuable resources and work to alternatives like Bun, Deno, etc.
Summarizing:
Practically, I think it is unlikely that we will be able to remove all primordials. It might be better to solve this problem on a case-by-case basis, because there are places where it makes sense to use primordials and places where it doesn't. If we find a part of the code that is slow because of primordials, I would suggest removing the uses of primordials from there and presenting the benchmark results. If the reviewers are convinced that the performance improvement is worth more than the robustness provided by the primordials, the change is going to land. I agree that the usage of primordials does make it harder for contributors, but I don't think it's something that would prevent folks from contributing. Anyone who is not familiar with primordials can submit a PR that doesn't use primordials at all, and collaborators can guide them to use primordials appropriately.
That's easier said than done; it's not something the TSC can decide, it can only happen if contributions are made to make it happen. Primordials were introduced in the codebase by contributors, not by a TSC decision; similarly, they can be removed by contributors. It's not up to the TSC – unless Node.js collaborators cannot reach a consensus when reviewing a PR, only then can a TSC vote resolve the situation.
That could be true for anything, that doesn't sound like a good argument.
We seem to have consensus that it’s harder to work in primordials than not. At least most people feel that way, and while some may say primordials make no difference I doubt anyone would argue that primordials make contributing easier. So it’s just a question of whether the cost to developer productivity (and indirectly, to contributions generally) is worth the benefit. In my view, if we don’t use primordials everywhere—and it seems like we don’t, or can’t—then we’re not getting any security benefit. A fence with a hole in it isn’t much better than no fence at all. So I don’t see what benefit we’re getting out of primordials. If we thought that there was a potential future where we could get to 100% primordials usage—a fence with no holes—then at least there’s a decision to be made as to whether the security benefit is worth the productivity cost. But if that’s not on the horizon, then it seems like we’re paying a big cost and getting nothing in return.
@GeoffreyBooth maybe you missed it, but as was said several times in this thread, primordials are not a security feature. If Node.js code were using 100% primordials, there would still be holes, because that's how the ECMAScript spec is written. Primordials are a tool, sometimes useful, sometimes not. “If we can't have it everywhere, let's have it nowhere” is not a good take; the reality is more nuanced than that.
I mean, their purpose is to prevent user code from overriding prototype methods and gaining access to Node internals, right? For what reason would we do that other than security? Or put another way, what benefit are we getting out of primordials? Regardless of whether it’s worth the cost, what is the gain?
I recommend reading Joyee's comment to get context and an informed critique of primordials: #1438 (comment) If you want a TL;DR, the short answer to your question is
I don’t really see “As a user, I want to be able to delete methods from prototypes and still have everything function correctly” as a use case that we need to support. This feels like an extremely minimal gain that few users have ever benefited from; was there an issue opened before primordials were added where anyone complained about Node’s pre-primordials behavior in this regard? Even if there are a few users out there who appreciate this new behavior, I think continuing to use primordials is much too costly to us in terms of time and effort and decreased contributions to justify maintaining that UX.
I don't think there's an easy keyword you can use, but nodejs/node#18795 can give you some clues, e.g. nodejs/node#12956, which leads to a pretty convincing point (at least for me): if I can do it in Chrome, why can't I do it in Node.js? Is it because Node.js is not as robust as Chrome? (I guess it's okay to just give up and say, "yeah, we are not as robust as Chrome", but just to give some history about why this thing exists).
@GeoffreyBooth here is a robustness argument:
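As an illustrative sketch of the kind of failure such an argument points at (the snippet below is an assumed example, not necessarily the one originally given):

```js
'use strict';

// A dependency (or test) replaces a built-in that an error path relies on.
Array.prototype.join = function () { throw new Error('patched join'); };

function formatProblems(problems) {
  // A typical error/reporting path: collect messages and join them for display.
  return 'Validation failed: ' + problems.join(', ');
}

try {
  formatProblems(['missing name', 'invalid age']);
} catch (err) {
  // The user now sees "patched join" instead of the validation failure;
  // the original, actionable error is masked by the tampered built-in.
  console.error(err.message);
}
```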
The example is contrived, but I've seen that in real code; debugging and error flows are places we should strongly favor developer experience over performance. Additionally, in web standards we literally have to use them in order to meet the web standard; we can decide to be non-compliant, but I think without extreme justification that's a hard pill to swallow.
@benjamingr You're right, but the examples you've shared are edge cases that most functions in Node.js don't need. Unless needed, such as in the examples you've provided, we don't need them. Removing everything except the requirements for the web standards might be a solution I'm ok with. But I seriously don't think we should add them as a
The test runner is another example of a place that benefits from primordials. Tests often include misbehaving/abnormal/monkey-patched code. In an ideal world, tests should not bring down the test runner.
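A small assumed example of that scenario, using the public node:test API (the global chosen for deletion is arbitrary):

```js
'use strict';
const { test } = require('node:test');
const assert = require('node:assert');

test('library copes with a missing global', (t) => {
  // The code under test deliberately tampers with the environment to simulate
  // an older runtime; the runner should still collect and report the result.
  const saved = globalThis.structuredClone;
  delete globalThis.structuredClone;
  t.after(() => { globalThis.structuredClone = saved; });

  assert.strictEqual(typeof globalThis.structuredClone, 'undefined');
});
```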
Another example is the ESM loader, which also relies on other built-in modules; we would want to close the door on any monkey-patchability, so primordials are still necessary there.
I'm pretty confident that without primordials, I'd be able to access virtually every internally used JS object and create a security issue. The reliability of the platform rests on whether the platform's code is able to establish boundaries or not, and without primordials or a similar approach, the platform largely won't be able to establish any.
Do any of express, fastify, axios, lodash, next etc. have any concept of something like primordials? If not, then by this bar 99% of users already have security issues that primordials won't solve.
While that is probably true, I don't know if that alone would qualify as a vulnerability under Node's threat model.
@Linkgoron that's fine, those are userland packages. I'm talking about the platform itself, for which a different bar applies. Userland packages aren't very fast, either, but node still cares about performance.
@cjihrig I'd have to actually try to do that auditing work to be sure, of course, but I think it's very likely a number of those leakages would indeed constitute a security issue.
My claim is that if 99.99% of code is (supposedly) already vulnerable, primordials don't really solve security concerns, and never will. It has also been stated over and over in this thread that primordials don't exist to solve security issues. Hell, JavaScript "itself" breaks by replacing stuff like the following:

```js
Array.prototype[Symbol.iterator] = () => { throw new Error('zz'); };
const [a, b, c] = [1, 2, 3];
```

which also breaks tons of other things IIRC, like default constructors or super or whatever, which is why Node (correctly) defines a lot of empty default constructors for classes. The platform should be held to a higher standard than userland code (cases for when it's needed have already been stated in this thread), but not in an all-or-nothing way, and not in a way that is a detriment to Node's own development.
I think this discussion is getting a bit frustrating for people, and several people from both sides of the debate have reached out to me, so I'd like to suggest an alternative. Here is an idea, parts of which were suggested by @joyeecheung:
I'm happy to drive this as a strategic initiative if there is interest from the TSC with the following format:
I would really like it if collaborators who have spent a lot of time thinking about this participate, namely @aduh95, @ronag, @joyeecheung and @anonrig, but there are a lot of others here, like @isaacs and @ljharb, that are welcome to weigh in and participate. I think our opinions don't differ that much honestly, and I think once we iron out the specifics we can come to a good resolution. @nodejs/tsc
Opened an issue #1439
For what it's worth, I remember bringing up the readability issues and barrier to contributors when we discussed the same thing, years ago. We had plenty of discussions with the same arguments back then. Since then, Node.js has implemented some security features that partially rely on the integrity of the JS implementation (namely, policies and permission model). This is an area that requires a lot of attention w.r.t. tampering.
@tniessen from my understanding policies explicitly do not make a "no API tampering guarantee":
I cannot go into detail here since the discussion happened in a security-related repo rather than a public one but you've raised some of these concerns about it yourself. As for permissions yes that code should absolutely use primordials and ideally be pen-tested and audited since it's a security feature - though I think it should be audited by a security professional closely before moving out of experimental anyway.
(That's not supposed to break default constructors, as of tc39/ecma262#2216, but V8 still hasn't fixed it.)
Oh hey, big discussion. For some historical context, @GeoffreyBooth:
Oh dear deity, yes. As early as 2010 (the node v0.1.x days), and not just once but again and again. Things like the REPL would fall over if you so much as looked at them wrong. Its multi-context support (i.e. give it its own global objects) was the first, if not altogether successful, attempt at fixing that. Not many people remember this but node at one point even supported loading each module in its own context. Worked about as well as you would expect it to and is why I removed it back in 2015. :-)
When primordials are removed, would an
How would a user call these checks? What do these checks report? Not all intrinsics modifications are bad. Technically a polyfill adding a new method on an existing prototype is changing the intrinsics. Polyfills can similarly replace intrinsics to support a new feature or fix a bug.
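For example (an assumed illustration, with a made-up method name), a well-behaved polyfill is indistinguishable, at the level of "the intrinsics were modified", from hostile tampering:

```js
'use strict';

// A polite polyfill: adds a missing method without touching existing ones,
// installed non-enumerably the way real built-ins are.
if (!Array.prototype.findLastIndexOr) { // hypothetical method name
  Object.defineProperty(Array.prototype, 'findLastIndexOr', {
    value(predicate, fallback) {
      for (let i = this.length - 1; i >= 0; i--) {
        if (predicate(this[i], i, this)) return i;
      }
      return fallback;
    },
    writable: true,
    enumerable: false,
    configurable: true,
  });
}

// A naive "did the intrinsics change?" check cannot tell this apart from an
// attack: the own-property set of Array.prototype has changed either way.
console.log(Object.getOwnPropertyNames(Array.prototype).includes('findLastIndexOr')); // true
```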
I'm going to go ahead and close this issue. Please feel free to open issues or raise concerns in https://github.com/nodejs/primordials-use-cases
History
Use primordials in the error path only if there are no significant perf regressions.
Proposal
Due to performance and several other concerns, I propose removing all primordials from the Node.js project. Remember that this is a living document and I'll try to update the following list based on feedback from the community and the TSC.
Pros
Cons
- [...] `security` in Node.js
- [...] in reality [...] to regress over the performance.
- `benchmarking.nodejs.org` [...] but IMHO that's fighting fire with fire.
- `known performance issues` [...] but we never follow them. For example, if you search for `ArrayPrototypePush` in Node.js core, you'll find a lot of examples and usages of primordials with `known performance issues`. Reference to the documentation: https://github.com/nodejs/node/blob/HEAD/doc/contributing/primordials.md#primordials-with-known-performance-issues
- The `primordials.md` document is 775 lines on top of our existing documentation, and we don't follow them even though it has explicit examples of what NOT to use - https://github.com/nodejs/node/blob/main/doc/contributing/primordials.md
- `proposal-symbol-proto` for mitigating prototype pollution.

Suggestion
I recommend the removal of primordials from the Node.js project. I strongly believe that they cause more harm than good, affecting the future of the project by causing performance regressions and friction for new contributors.
I'd appreciate it if you could leave a 👍 or 👎 on this issue.
cc @nodejs/tsc @gireeshpunathil @mcollina @ljharb @BridgeAR