
Removing primordials from Node.js project #1438

Closed

anonrig opened this issue Sep 11, 2023 · 68 comments

@anonrig (Member) commented Sep 11, 2023

Proposal

Due to performance and several other concerns, I propose removing all primordials from the Node.js project. Note that this is a living document, and I'll update the following list based on feedback from the community and the TSC.

Pros

  • It ensures that Node.js internals cannot be tampered with and are not susceptible to prototype pollution.

Cons

  • Any prototype pollution in Node.js core is not accepted as a vulnerability in our security/bug bounty program.
    • We use primordials for security purposes, but we do not include them in our bounty program, which contradicts the notion of security in Node.js.
  • It creates friction when optimizing performance: it requires contributors to benchmark the code with and without primordials, which in practice leads to performance regressions.
    • One solution is a performance-regression catcher like benchmarking.nodejs.org, but IMHO that's fighting fire with fire.
  • V8 doesn't always optimize primordials efficiently, which causes performance regressions, and what TurboFan optimizes is subject to change. This requires re-benchmarking on each V8 release.
  • It raises the bar for contributing to Node.js. The primordials.md document is 775 lines on top of our existing documentation, and we don't follow it even though it has explicit examples of what NOT to use - https://github.com/nodejs/node/blob/main/doc/contributing/primordials.md
    • The issue of disallowing certain primordials can be solved with linter rules, but IMHO that's fighting fire with fire.
  • There is an existing proposal in TC39 called proposal-symbol-proto for mitigating prototype pollution.
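
For context, here is a minimal sketch contrasting plain JS with the uncurried primordials style used in core (illustrative only: inside lib/ files the primordials object is injected by Node; here the helpers are emulated with Function.prototype.call.bind so the snippet runs standalone):

'use strict';

// Emulate two uncurried primordials (in core these come from the injected
// primordials object instead):
const ArrayPrototypePush = Function.prototype.call.bind(Array.prototype.push);
const StringPrototypeToLowerCase = Function.prototype.call.bind(String.prototype.toLowerCase);

// Plain JS, as most contributors write it:
const exts = [];
exts.push('.JS'.toLowerCase());

// The same logic written primordials-style, as required in core:
const safeExts = [];
ArrayPrototypePush(safeExts, StringPrototypeToLowerCase('.JS'));

console.log(exts, safeExts); // [ '.js' ] [ '.js' ]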

Suggestion

I recommend removing primordials from the Node.js project. I strongly believe they cause more harm than good, affecting the future of the project by causing performance regressions and friction for new contributors.


I'd appreciate it if you could leave a 👍 or 👎 on this issue.

cc @nodejs/tsc @gireeshpunathil @mcollina @ljharb @BridgeAR

@RafaelGSS (Member)

Can we have a draft to see exactly how much performance improvement we are talking about?

@aduh95 (Contributor) commented Sep 11, 2023

I don't think this proposal is reasonable: primordials are a useful tool for the reliability of the software, and most of the time they have no impact on performance. Also, as you mentioned, the TSC has already voted for the use of primordials in the error path, so your proposal should be compatible with that vote.

Any prototype pollution in Node.js core is not accepted as a vulnerability in our security/bug bounty program.

That's not a con, it's just a fact.

  • We use primordials for security purposes, but we do not include them in our bounty program, which contradicts the notion of security in Node.js.

We do not use primordials with the purpose of security.

  • We have a section that mentions known performance issues but we never follow them.

It's really unfair to state "we never follow them"; in reality, those changes tend to be watched carefully and don't land if a perf regression is shown.

For example, if you search for ArrayPrototypePush in Node.js core, you'll find a lot of examples and usages of primordials with known performance issues.

Well what does it tell you? Not every part of lib/ is on a hot path, sometimes reliability is more important than performance.

  • The issue of disallowing certain primordials can be solved with linter rules, but IMHO that's fighting fire with fire.

It's already in place: we have the avoid-prototype-pollution rule that forbids the use of some primordials.

That's not a con either.

@benjamingr (Member)

I feel strongly that we should keep primordials where they are required for web standards (for example, EventTarget should be reasonably tamper-proof, because not doing so would violate spec semantics).

I also feel strongly we should keep primordials in "critical" error paths that must never throw.

I am +1 on removing primordials everywhere else. Multiple people have mentioned that they are a burden on contribution and that they make the code less readable, debuggable, and fun to work on, even if performance were not an issue. They create friction in PRs, cause CI reruns, and often cause needless churn. To be clear: it's not that they're not valuable; it's that their value is greatly outweighed by their burden, IMO.


It's important to echo @aduh95's point: primordials are about reliability and are not a security mechanism; they do not provide tamper-proofing, nor do they sandbox anything.


@aduh95 the fact that a TSC vote happened doesn't mean the TSC can't change its mind? Personally, I'm a lot less pro-primordials than I was 2 years ago, given the feedback several people gave me about how it impacts their contribution experience. Also, the TSC itself changes in composition.

@ljharb (Member) commented Sep 11, 2023

A JS platform that falls apart when someone does a normal JavaScript action would be an embarrassment.

Primordials may not be the best or only way to solve this problem, but simply not solving it is something I would hope is deemed wildly unacceptable.

@aduh95 (Contributor) commented Sep 11, 2023

@aduh95 the fact that a TSC vote happened doesn't mean the TSC can't change its mind? Personally, I'm a lot less pro-primordials than I was 2 years ago, given the feedback several people gave me about how it impacts their contribution experience. Also, the TSC itself changes in composition.

I'm not sure; this has never happened before, AFAICT. I still think Yagiz should try to draft a proposal that's compatible with it, or at least one that does not overlap with it, so the discussion can be more productive. Depending on the outcome of that proposal, you could consider overriding the previous vote, but now's not the time, IMO.

@ronag (Member) commented Sep 11, 2023

A JS platform that falls apart when someone does a normal JavaScript action would be an embarrassment.

This is more of an opinion than a statement of fact. IMHO even though there are clear benefits, the cons outweigh the pros.

@anonrig I don't think the current consensus of only using primordials for the error path is too bad.

@benjamingr (Member)

I'm not sure; this has never happened before, AFAICT

I think trying things and learning from them is a strength. The TSC was super opposed to promises at one point, for example, to the point that people were actively mocking me for advocating them (maybe it was the CTC? It was a long time ago, when that was still a thing :)). I change my mind about a lot of things personally, though admittedly most of those things weren't a TSC vote.

I think it would be productive to focus on arguments for and against the usage of primordials (generally and in specific cases), though to be clear, if my PoV (primordials in critical error paths and web standards, and not elsewhere) doesn't reach consensus, I'd be fine with that.


A JS platform that falls apart when someone does a normal JavaScript action would be an embarrassment.

I don't think that's a productive statement, especially given that none of the userland packages use them, so I don't think we're giving a meaningful guarantee with them anyway. Additionally, I don't like the language "would be an embarrassment": an embarrassment to whom? It also assumes other people's feelings about stuff.

Primordials may not be the best or only way to solve this problem, but simply not solving it is something I would hope is deemed wildly unacceptable.

The issue isn't "primordials solve nothing, so let's remove them"; it's "the new problems they create in our code and the friction they add are maybe not worth the thing they're solving for us".

@ronag (Member) commented Sep 11, 2023

@benjamingr +1 on that

@ljharb (Member) commented Sep 11, 2023

@benjamingr all of my userland packages do, in fact, use them. I apologize for the choice of language; how better could I indicate how serious I think this is?

@benjamingr (Member)

@ljharb

all of my userland packages do, in fact, use them.

First of all - 😮 😅
Second of all - how? What if someone changes a builtin prototype before your packages' code is loaded?
Third of all - I suspect that for most popular packages that is pretty rare.

how better could i indicate how serious I think this is?

"Hey, I feel strongly that we should keep using primordials. They are beneficial because they prevent issues like A,B or C which are important to users and people have run into (e.g here are some issues people opened complaining about it D, E or F). I think the project would lose from removing them."

Basically, objective, unambiguous facts about why you think your opinion is correct would have the best chance of changing people's (and the TSC's) opinions, ideally with numbers.

@ljharb (Member) commented Sep 11, 2023

What if someone changes a builtin prototype before your packages' code is loaded?

the nature of JavaScript is such that you can't defend against that, but if your code is first-run, you can cache everything you need.

how?

I use https://npmjs.com/call-bind and https://npmjs.com/get-intrinsic to achieve it.
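
For illustration, a minimal sketch of that first-run caching idea in plain JS (not the actual internals of call-bind or get-intrinsic):

'use strict';

// Cache receiver-shifted copies of builtins at first-run, before any other
// code gets a chance to tamper with them:
const arrayPush = Function.prototype.call.bind(Array.prototype.push);
const stringSlice = Function.prototype.call.bind(String.prototype.slice);

// Someone later deletes or replaces the prototype method...
delete Array.prototype.push;

// ...but the cached copies still work:
const list = [];
arrayPush(list, 1, 2);
console.log(list);                    // [ 1, 2 ]
console.log(stringSlice('hello', 1)); // 'ello'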

pretty rare

I agree! But that doesn't mean that Node shouldn't be robust against that. Lots of use cases for Node have zero dependencies, especially with the test runner, where deleting builtins is actually a more common thing to do to simulate different environments and test an implementation against them, and robustness in that case is important.

"…"

Thanks, I appreciate that feedback and will try to incorporate it.

@aduh95 (Contributor) commented Sep 11, 2023

@benjamingr it's all fine if TSC members change their mind, I do that all the time too, highly recommend it, past Antoine was a complete idiot most of the time. Still, a TSC decision needs to be somewhat final, otherwise we cannot move forward. I don't know if the TSC charter says anything about it, but anyway, it's going to create off-topic discussions to try to override an existing vote; I think we can discuss primordials while accepting the outcome of the previous vote.

My personal opinion on primordials is that they provide more upsides than downsides; I don't know if it's because I lack empathy or something, but I don't think they create a lot of friction in PRs, and it doesn't seem to me that they're a burden on contribution. If anything, I think they make the code more fun and debuggable, but I've learned that I might be in the minority here.

@benjamingr (Member)

@ljharb

the nature of JavaScript is such that you can't defend against that, but if your code is first-run, you can cache everything you need.

Yes, but if you load "Package A", which changes a prototype, then "Package B", loaded after it, will get the polluted prototype. This can actually be useful (e.g., for instrumentation), and defending against it would make certain use cases impossible (e.g., people writing instrumentation that "deeply" hooks a server library).

I do think it would be very helpful to be able to provide this guarantee (though for me it's only meaningful if it's a guarantee for actual user code, which is likely to use many dependencies). It would be ideal for the project to promote a proposal to address this at the ECMAScript level (which is where I believe it naturally belongs) rather than attempt to ship "semi-correct" approaches that provide very fragile guarantees.

@aduh95

@benjamingr it's all fine if TSC members change their mind, I do that all the time too, highly recommend it, past Antoine was a complete idiot most of the time. Still, a TSC decision needs to be somewhat final, otherwise we cannot move forward. I don't know if the TSC charter says anything about it, but anyway, it's going to create off-topic discussions to try to override an existing vote; I think we can discuss primordials while accepting the outcome of the previous vote.

The TSC vote about primordials was a year and a half ago, quite a respectable amount of time in software, I think? I'm personally in favor of keeping primordials in error paths (and was not part of that vote), but I think it's fine to raise the issue again and see if folks' opinions have changed.

My personal opinion on primordials is that they provide more upsides than downsides; I don't know if it's because I lack empathy or something, but I don't think they create a lot of friction in PRs, and it doesn't seem to me that they're a burden on contribution. If anything, I think they make the code more fun and debuggable, but I've learned that I might be in the minority here.

I acknowledge that for some people (like yourself and probably others) they don't create a burden on contribution. I don't understand how they make stuff more fun or debuggable though :D

I've had several contributors complain about them to me in the past. I personally find them a bit tedious (a bit more so recently), but I've reviewed, approved, and helped land a bunch of primordials changes before that perspective shift. Even before that, they were "something I put up with to contribute", like GYP, different/weird debugging/testing flows, etc.

@ljharb (Member) commented Sep 11, 2023

@benjamingr yes, that's why the first thing you load in an application, to be safe, should be the code that sets up the environment - shims/polyfills, things that can be cached like get-intrinsic, etc.

I 100% believe this should be handled in TC39 itself, but I don't think node should give up this important robustness property in the meantime.

@JakobJingleheimer

I would love to not need primordials (not least because they're confounding to new contributors; if Antoine weren't here, we'd be kinda screwed). However, they provide a very important function that I think we can't just decide to stop caring about. If we had some way to not need them, I'm all ears!

@GeoffreyBooth (Member)

I don't think the current consensus of only using primordials for the error path is too bad.

This is the rule we’re supposed to be following? I thought I was supposed to use primordials everywhere, for any method for which we had a primordial available. You’re telling me I could have limited primordials to just error paths this whole time?

@mcollina (Member)

Any prototype pollution in Node.js core is not accepted as a vulnerability in our security/bug bounty program.

This is incorrect. Any prototype pollution is accepted as a vulnerability by our threat model, i.e. causing a modification of a prototype as a side effect of calling a function in Node.js core. However, internals that are not hardened against a prototype pollution that already happened are not vulnerabilities, because they are unexploitable. Hopefully this distinction is clear.

@isaacs commented Sep 11, 2023

Primordials are a "least bad" solution, and even then, only because there are so few solutions available for the problem they're intended to solve.

Reading through this and some of the other discussions around primordials, and having done some work in the effort to get minipass and glob into core, I think a few things should be uncontroversial. (If you disagree, or think I'm overstating something or making an unfair generalization, I'm curious about it! That's not my intent.)

  • Primordials present a significant barrier to contribution for many JavaScript authors, even some who are very experienced, skilled, and could provide value to the Node project.
  • It would usually be bad if userland programs could modify intrinsic objects and cause the Node platform to behave incorrectly. (There are some cases where it's surprising that modifying an intrinsic used by Node isn't reflected in the platform behavior, but on balance, we'd all usually prefer "you can't change it" over "you can change it by accident".)
  • Primordials are not a security feature per se, but they do provide some protection against some classes of security issues, so the security benefit isn't nothing.
  • The documentation around how, why, and when to use them, and the situations where those rules can be bent or broken, is subtle, and few people fully understand it (especially new contributors). E.g., when it says they shouldn't be used on "hot paths", or that sometimes reliability is more important than performance, OK, how is that determined? How hot is "hot"?

I don't know of any alternative way to achieve the goals, but there are things that could maybe be explored to reduce the costs, which I've heard suggested before:

  • Develop a transpiler that turns "normal" JS into primordial-ized JS with a build tool. No small task, but something that could be done entirely on node's side. (A sketch of what its output might look like follows this list.)
  • Work with the V8 team (or even TC39) to develop an API or language feature to defend against intrinsic object mutation. Many more stakeholders, unbounded time to delivery.
  • Increase coverage of benchmark testing, establish objective documented criteria for exactly how hot-path something must be to justify using primordials, and enforce this with CI tooling, so at least primordial usage is clear and minimized as much as possible.
  • Accept that if someone breaks their platform, then their platform will be broken, and just stop using primordials. This doesn't provide the value that primordials provides, but omg, so cheap and easy, zero maintenance at all.
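
To make the first suggestion concrete, here is a hypothetical before/after of the transformation such a build tool would have to perform. The uncurried names follow the convention of Node's primordials, emulated here so the sketch runs standalone; statically knowing which prototype each method call resolves to is the hard part:

'use strict';

// Emulation so the sketch runs outside core; in lib/ files, the primordials
// object is injected by Node itself.
const uncurryThis = (fn) => (thisArg, ...args) => Reflect.apply(fn, thisArg, args);
const primordials = {
  ArrayPrototypeMap: uncurryThis(Array.prototype.map),
  StringPrototypeCharAt: uncurryThis(String.prototype.charAt),
  StringPrototypeToUpperCase: uncurryThis(String.prototype.toUpperCase),
  StringPrototypeSlice: uncurryThis(String.prototype.slice),
};

// Input: "normal" JS as a contributor would write it.
function capitalize(words) {
  return words.map((w) => w.charAt(0).toUpperCase() + w.slice(1));
}

// Output the hypothetical transpiler would need to emit.
const { ArrayPrototypeMap, StringPrototypeCharAt, StringPrototypeToUpperCase, StringPrototypeSlice } = primordials;
function capitalizeSafe(words) {
  return ArrayPrototypeMap(words, (w) =>
    StringPrototypeToUpperCase(StringPrototypeCharAt(w, 0)) + StringPrototypeSlice(w, 1));
}

console.log(capitalize(['foo']), capitalizeSafe(['foo'])); // [ 'Foo' ] [ 'Foo' ]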

Primordials are a low-value local-maximum solution; just good enough to keep better solutions from being explored, but not actually very good. None of the better solutions are particularly easy or satisfying, either, so the moat around it is deep.

@ljharb (Member) commented Sep 11, 2023

I'd also like to note that I have been pursuing multiple proposals, over years, and with years to come, with the goal of making primordial-esque safety much more ergonomic for JS developers - and I've used (and will be using) node's primordials (and their UX difficulties) as a big part of the motivation.

In other words, I don't think this primordials approach is going to be necessary in node forever, but I think it should remain until there's a better option.

@aduh95 (Contributor) commented Sep 11, 2023

@isaacs I mostly agree with your summary and think it represents the opinion of the majority fairly. If I may, I'd like to note a few things regarding your suggestions:

  • Develop a transpiler that turns "normal" JS into primordial-ized JS with a build tool. No small task, but something that could be done entirely on node's side.

I don't think it's achievable, otherwise we would have done it. It might be possible with some stricter variant of TypeScript, but inferring from "normal" JS what primordials to use is not possible.

  • Work with the V8 team (or even TC39) to develop an API or language feature to defend against intrinsic object mutation. Many more stakeholders, unbounded time to delivery.

Regarding TC39, if you read through tc39/proposal-call-this#8, it seems they won't be interested in offering new syntax for a problem that's deemed too niche.
Reaching out to V8 to improve the performance of primordials sounds good, if there's an issue open on their issue tracker; I don't know if they would be interested in developing and maintaining an API they have no use for – it would be pretty cool to have, though.

  • Increase coverage of benchmark testing, establish objective documented criteria for exactly how hot-path something must be to justify using primordials, and enforce this with CI tooling, so at least primordial usage is clear and minimized as much as possible.

It seems a bit contradictory: either we want primordials usage maximized as much as possible, and we use benchmark CIs to detect when that's not a good idea; or we want it minimized as much as possible, and benchmarks are not very important.
But in any case, I'm all for clarification.

@GeoffreyBooth (Member)

It seems a bit contradictory: either we want primordials usage maximized as much as possible, and we use benchmark CIs to detect when that's not a good idea; or we want it minimized as much as possible, and benchmarks are not very important.

Either primordials are necessary for security or they’re not. If there are certain parts of Node where we’re like, “nah, don’t bother with primordials here because this code is too important,” then why use them anywhere? Only some parts of Node are worth protecting from intrusion, and not others?

@anonrig (Member, Author) commented Sep 12, 2023

I agree with @GeoffreyBooth 100%.

@ljharb (Member) commented Sep 12, 2023

I totally agree that all parts should be covered by primordials.

@Qard (Member) commented Sep 12, 2023

It's worth considering that the cost of primordials is uneven.

A lot of the perf impact is in the prototype methods that get wrapped in a closure so they can look like regular functions that shift their "this" value into the first parameter position. Would the perf be better if prototype primordials were instead just the original function and had to be called with ReflectApply in-place?
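
For reference, a rough sketch of the two calling styles being compared (Node's real helpers live in lib/internal/per_context/primordials.js and differ in the details):

'use strict';

// Today's style (roughly): wrap the prototype method in a closure that
// shifts the receiver into the first parameter position.
const uncurryThis = (fn) => (thisArg, ...args) => Reflect.apply(fn, thisArg, args);
const ArrayPrototypePush = uncurryThis(Array.prototype.push);

const arr = [];
ArrayPrototypePush(arr, 1); // pays for the wrapper closure + rest/spread on every call

// The suggested alternative: keep the original function and call it in-place.
const push = Array.prototype.push;
Reflect.apply(push, arr, [2]); // one direct call, no wrapper

console.log(arr); // [ 1, 2 ]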

The other one is the Safe* types: do we know the perf difference between building those versus just using the primordial functions on the plain types? I suppose with those types we want the objects to remain tamper-proof after handing them to user code? Can we achieve that correctly by freezing the objects we expose instead?

@Linkgoron (Member) commented Sep 12, 2023

Either primordials are necessary for security or they’re not. If there are certain parts of Node where we’re like, “nah, don’t bother with primordials here because this code is too important,” then why use them anywhere? Only some parts of Node are worth protecting from intrusion, and not others?

Yes, of course some parts are more critical or sensitive than others. You might want to protect something like EventTarget, or process.exit, or default constructors, but other performance-critical parts would be better off being more performant or readable instead. The project trusts the expertise of the team members to weigh the pros/cons in a lot of situations. Trade-offs are made in a lot of parts of the code (in general), and primordials (which I very much dislike; their pros are IMO overblown and their cons downplayed) are no different.

@ronag (Member) commented Sep 12, 2023

Just my two cents: while both performance and avoiding prototype pollution are important, I think contributor accessibility and ergonomics are very important too. People work on Node mostly for free, because they find it fun and interesting. If we don't value that, we will turn away developers and stagnate. That being said, I'm sure there are developers working on Node because they think primordials are an important and interesting topic. But I'm not sure that's where the majority lies.

While we haven't yet done a post-mortem on what motivated Bun over working on improving Node, I personally believe that if we don't keep Node interesting and accessible for contributors, we will end up losing valuable resources and work to alternatives like Bun, Deno, etc.

@mcollina (Member)

Summarizing:

  1. Some primordials (as we have implemented them) add significant overhead, making 100% coverage too costly in terms of performance.
  2. Even 100% hardening coverage would not allow us to give more guarantees inside our threat model, because user code (including dependencies) could still be vulnerable.
  3. Very few collaborators like using primordials.

@RaisinTen (Contributor)

Practically, I think it is unlikely that we will be able to remove all primordials. It might be better if we solve this problem on a case-by-case basis because there are places where it makes sense to use primordials and there are places where it doesn't.

If we find a part of the code that is slow because of primordials, I would suggest removing the uses of primordials from there and presenting the benchmark results. If the reviewers are convinced that the performance improvement is worth more than the robustness provided by the primordials, the change is going to land.

I agree that the usage of primordials does make things harder for contributors, but I don't think it's something that would prevent folks from contributing. Anyone who is not familiar with primordials can submit a PR that doesn't use primordials at all, and collaborators can guide them to use primordials appropriately.

@aduh95 (Contributor) commented Sep 12, 2023

If we do want to have the security of primordials, I think we should implement those functions in C++, not keep primordials.

That's easier said than done, and it's not something the TSC can decide; it can only happen if contributions are made to make it happen. Primordials were introduced in the codebase by contributors, not by a TSC decision; similarly, they can be removed by contributors. It's not up to the TSC – only if Node.js collaborators cannot reach a consensus when reviewing a PR can a TSC vote resolve the situation.
That being said, I'm not convinced moving to C++ would lower the bar for contribution. It would definitely do a better job than primordials at protecting Node.js internals against global mutations.

This doesn't mean that it will continue to do that with the next major release, and we don't have the resources & benchmarks to validate it on every release.

That could be true of anything; it doesn't sound like a good argument.

@GeoffreyBooth (Member)

We seem to have consensus that it's harder to work with primordials than without. At least most people feel that way, and while some may say primordials make no difference, I doubt anyone would argue that primordials make contributing easier. So it's just a question of whether the cost to developer productivity (and indirectly, to contributions generally) is worth the benefit.

In my view, if we don’t use primordials everywhere—and it seems like we don’t, or can’t—then we’re not getting any security benefit. A fence with a hole in it isn’t much better than no fence at all. So I don’t see what benefit we’re getting out of primordials. If we thought that there was a potential future where we could get to 100% primordials usage—a fence with no holes—then at least there’s a decision to be made as to whether the security benefit is worth the productivity cost. But if that’s not on the horizon, then it seems like we’re paying a big cost and getting nothing in return.

@aduh95 (Contributor) commented Sep 12, 2023

@GeoffreyBooth maybe you missed it, but as has been said several times in this thread, primordials are not a security feature. If Node.js code were using 100% primordials, there would still be holes, because that's how the ECMAScript spec is written. Primordials are a tool, sometimes useful, sometimes not. "If we can't have it everywhere, let's have it nowhere" is not a good take; the reality is more nuanced than that.

@GeoffreyBooth (Member)

@GeoffreyBooth maybe you missed it, but as has been said several times in this thread, primordials are not a security feature.

I mean, their purpose is to prevent user code from overriding prototype methods and gaining access to Node internals, right? For what reason would we do that other than security?

Or put another way, what benefit are we getting out of primordials? Regardless of whether it’s worth the cost; what is the gain?

@aduh95 (Contributor) commented Sep 12, 2023

I recommend reading Joyee's comment for context and an informed critique of primordials: #1438 (comment)

If you want a TL;DR, the short answer to your question is

I do not view primordials as a proper security mechanism, I see them as a way to enhance user experience (i.e. you don't want to delete Array.prototype.something and then blow up the REPL). Sure, not having 100% coverage means Node.js may still break under certain input, but as all user experience improvements it can always just be a best-effort thing.

@GeoffreyBooth (Member)

you don’t want to delete Array.prototype.something

I don’t really see “As a user, I want to be able to delete methods from prototypes and still have everything function correctly” as a use case that we need to support. This feels like an extremely minimal gain that few users have ever benefited from; was there an issue opened before primordials were added where anyone complained about Node’s pre-primordials behavior in this regard?

Even if there are a few users out there who appreciate this behavior, I think continuing to use primordials is much too costly to us, in terms of time and effort and decreased contributions, to justify maintaining that UX.

@joyeecheung (Member) commented Sep 12, 2023

was there an issue opened before primordials were added where anyone complained about Node’s pre-primordials behavior in this regard?

I don't think there's an easy keyword you can use, but nodejs/node#18795 can give you some clues, e.g. nodejs/node#12956, which leads to a pretty convincing point (at least for me): if I can do it in Chrome, why can't I do it in Node.js? Is it because Node.js is not as robust as Chrome? (I guess it's okay to just give up and say, "yeah, we are not as robust as Chrome", but this is just to give some history about why this thing exists.)

@benjamingr (Member)

@GeoffreyBooth here is a robustness argument:

  • Take the promise unhandled rejection error handling code for example.
  • Say the user used a library that incorrectly polyfilled Error.prototype.stack, let's say to add more content there
  • The implementation has an edge case that throws an error when populating that property.
  • The user runs into the edge case, a promise is rejected and ends up in the "possibly unhandled rejections" list.
  • Without primordials in the error path, if the error path happens to create an error and that throws, the user gets a super confusing error without a stack trace (and at the wrong time, preventing the user from handling the error!).
  • With them they get an error indicating where the actual failure was.

The example is contrived, but I've seen it in real code; debugging and error flows are places where we should strongly favor developer experience over performance.
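
To make the failure mode concrete, here is a contrived, minimal sketch (not Node's actual unhandled-rejection code; it tampers with Error.prototype.toString rather than .stack so it reproduces reliably):

'use strict';

// Primordial-style cache, taken at startup before any user code runs:
const errorToString = Function.prototype.call.bind(Error.prototype.toString);

// A misbehaving "polyfill" (hypothetical) later tampers with the prototype:
Error.prototype.toString = function () { throw new Error('polyfill bug'); };

try {
  JSON.parse('not json'); // the real failure
} catch (err) {
  try {
    // A reporting path that stringifies the error through the (polluted)
    // prototype now throws its own error, masking the original one:
    console.log(`caught: ${err}`);
  } catch (secondary) {
    console.log('report failed with:', secondary.message); // 'polyfill bug'
  }
  // The cached copy still reports the real error:
  console.log('caught:', errorToString(err)); // SyntaxError: ...
}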

Additionally, for web standards we literally have to use them in order to meet the spec; we can decide to be non-compliant, but without extreme justification I think that's a hard pill to swallow.

@anonrig (Member, Author) commented Sep 12, 2023

@benjamingr You're right, but the examples you've shared are edge cases that most functions in Node.js don't need. Unless they're needed, as in the examples you've provided, we shouldn't use them. Error.prototype is both a good example and a bad one, because primordials in Error have the highest impact if we're talking performance-wise.

Removing everything except what's required for the web standards might be a solution I'm OK with. But I seriously don't think we should add them on a case-by-case basis, because that's open to interpretation and creates issues like this one.

@cjihrig (Contributor) commented Sep 12, 2023

The test runner is another example of a place that benefits from primordials. Tests often include misbehaving/abnormal/monkey-patched code. In an ideal world, tests should not bring down the test runner.

@aduh95 (Contributor) commented Sep 12, 2023

Another example is console.log: it's tempting to override a built-in method to add console.log calls to understand what path the code follows; the problem is that console.log relies heavily on other built-in modules (node:stream, node:util), which themselves rely on other modules, etc.

The ESM loader also relies on other built-in modules, and we want to close the door on any monkey-patchability; primordials are still necessary there.
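
A sketch of the hazard (hypothetical; in today's core, primordials are what keep this pattern safe):

'use strict';

// Debugging hack: trace every call to Array.prototype.join.
const originalJoin = Array.prototype.join;
Array.prototype.join = function (...args) {
  // If console.log itself reached .join() through the prototype chain (it
  // relies on node:util and node:stream internally), this line would re-enter
  // the patch and recurse forever. Primordials break that cycle in core.
  console.log('join called on', this.length, 'elements');
  return originalJoin.apply(this, args);
};

console.log([1, 2, 3].join('-')); // traces once, then prints '1-2-3'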

@ljharb (Member) commented Sep 12, 2023

I'm pretty confident that without primordials, I'd be able to access virtually every internally used JS object and create a security issue. The reliability of the platform rests on whether the platform's code is able to establish boundaries or not, and without primordials or a similar approach, the platform largely won't be able to establish any.

@Linkgoron (Member)

I'm pretty confident that without primordials, I'd be able to access virtually every internally used JS object and create a security issue. The reliability of the platform rests on whether the platform's code is able to establish boundaries or not, and without primordials or a similar approach, the platform largely won't be able to establish any.

Do any of express, fastify, axios, lodash, next, etc. have any concept of something like primordials? If not, then by this bar 99% of users already have security issues that primordials won't solve.

@cjihrig (Contributor) commented Sep 12, 2023

I'd be able to access virtually every internally used JS object and create a security issue.

While that is probably true, I don't know if that alone would qualify as a vulnerability under Node's threat model.

@ljharb (Member) commented Sep 12, 2023

@Linkgoron that's fine, those are userland packages. I'm talking about the platform itself, for which a different bar applies. Userland packages aren't very fast, either, but node still cares about performance.

@ljharb (Member) commented Sep 12, 2023

@cjihrig I'd have to actually try to do that auditing work to be sure, of course, but I think it's very likely that a number of those leakages would indeed constitute a security issue.

@Linkgoron (Member) commented Sep 12, 2023

@Linkgoron that's fine, those are userland packages. I'm talking about the platform itself, for which a different bar applies. Userland packages aren't very fast, either, but node still cares about performance.

My claim is that if 99.99% of code is (supposedly) already vulnerable, primordials don't really solve security concerns, and never will. It has also been stated over and over in this thread that primordials don't exist to solve security issues. Hell, JavaScript "itself" breaks by replacing stuff like the following:

Array.prototype[Symbol.iterator] = () => { throw new Error('zz'); }
const [a,b,c] = [1,2,3];

which also breaks tons of other things iirc like default constructors or super or whatever, which is why Node (correctly) defines a lot of empty default constructors for classes. The platform should be held to a higher standard than userland code (cases where it's needed have already been stated in this thread), but not in an all-or-nothing way, and not in a way that is a detriment to Node's own development.

@benjamingr (Member)

I think this discussion is getting a bit frustrating for people, and several people from both sides of the debate have reached out to me, so I'd like to suggest an alternative:

Here is an idea, parts of it suggested by @joyeecheung:

  • We add tests to flows we care about being guarded with primordials, this includes stuff like process exit handlers that should always fire and cleanup that should always happen.
  • We add web platform tests or utilize existing ones to enforce primordials in web standards where required.
  • We build consensus around not adding primordials where we believe it creates a negative impact on contribution which is most other places.
  • We should update the docs both internal and external.

I'm happy to drive this as a strategic initiative if there is interest from the TSC with the following format:

  • We (all members invited) will meet a few times to write use cases and then derive tests from those use cases that will run during CI to enforce that we don't break any primordials guarantees (e.g. exit handlers). We will try to keep the scope small.
  • We will add a warning to the docs outlining the robustness model regarding primordials and what guarantees Node makes. We will make a recommendation and an official "Node wants X" for TC39 discussion.
  • We will recommend where to enforce primordials and where not to enforce them and what the PR flow around them should be.
  • The TSC will approve every step above either by consensus (good) or by vote (bad).
  • All this should happen in the timespan of a month or two, ideally. We rush nothing, hear everyone out and adapt/amend when needed.

I would really like it if collaborators who have spent a lot of time thinking about this participate, namely @aduh95, @ronag, @joyeecheung, and @anonrig, but there are a lot of others here, like @isaacs and @ljharb, who are welcome to weigh in and participate.

I think our opinions don't differ that much honestly and I think once we iron out the specifics we can come to a good resolution.

@nodejs/tsc

@benjamingr (Member)

Opened an issue #1439

@tniessen (Member)

For what it's worth, I remember bringing up the readability issues and barrier to contributors when we discussed the same thing, years ago. We had plenty of discussions with the same arguments back then.

Since then, Node.js has implemented some security features that partially rely on the integrity of the JS implementation (namely, policies and the permission model). This is an area that requires a lot of attention w.r.t. tampering.

@benjamingr (Member)

@tniessen from my understanding policies explicitly do not make a "no API tampering guarantee":

Redirection does not prevent access to APIs through means such as direct access to require.cache or through module.constructor which allow access to loading modules. Policy redirection only affects specifiers to require() and import. Other means, such as to prevent undesired access to APIs through variables, are necessary to lock down that path of loading modules.

I cannot go into detail here, since the discussion happened in a security-related repo rather than a public one, but you've raised some of these concerns about it yourself.

As for permissions: yes, that code should absolutely use primordials and ideally be pen-tested and audited, since it's a security feature - though I think it should be audited closely by a security professional before moving out of experimental anyway.

@bakkot commented Sep 16, 2023

Hell, JavaScript "itself" breaks by replacing stuff like the following:

Array.prototype[Symbol.iterator] = () => { throw new Error('zz'); }
const [a,b,c] = [1,2,3];

which also breaks tons of other things iirc like default constructors or super or whatever [emphasis added], which is why Node (correctly) defines a lot of empty default constructors for classes

(That's not supposed to break default constructors, as of tc39/ecma262#2216, but V8 still hasn't fixed it.)

@bnoordhuis (Member)

Oh hey, big discussion. For some historical context, @GeoffreyBooth:

I don’t really see “As a user, I want to be able to delete methods from prototypes and still have everything function correctly” as a use case that we need to support. This feels like an extremely minimal gain that few users have ever benefited from; was there an issue opened before primordials were added where anyone complained about Node’s pre-primordials behavior in this regard?

Oh dear deity, yes. As early as 2010 (the node v0.1.x days), and not just once but again and again.

Things like the REPL would fall over if you so much as looked at them wrong. Its multi-context support (i.e., giving it its own global objects) was the first, if not altogether successful, attempt at fixing that.

Not many people remember this but node at one point even supported loading each module in its own context. Worked about as well as you would expect it to and is why I removed it back in 2015. :-)

@muescha commented Sep 26, 2023

When primordials are removed, would an integrityCheck or a primordialsCheck be of interest? These checks would ascertain whether the prototypes have been altered by the user. This way, bugs stemming from prototype modifications can be more accurately identified. Users should always have the ability to run this check to determine if any changes have been made or if the code remains vanilla.
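
A minimal sketch of what such a check might look like (the names snapshot and checkIntegrity are made up for illustration): record selected intrinsics at startup, then report which ones were replaced later.

'use strict';

// Record intrinsics at startup...
const snapshot = {
  'Array.prototype.push': Array.prototype.push,
  'JSON.stringify': JSON.stringify,
};

// ...then report which ones no longer match the originals.
function checkIntegrity() {
  return Object.keys(snapshot).filter((name) => {
    let current = globalThis;
    for (const key of name.split('.')) current = current[key];
    return current !== snapshot[name];
  });
}

console.log(checkIntegrity()); // [] (still vanilla)

JSON.stringify = (value) => String(value); // a "polyfill" replaces a builtin
console.log(checkIntegrity()); // [ 'JSON.stringify' ]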

@mhofman commented Sep 26, 2023

would an integrityCheck or a primordialsCheck be of interest?

Users should always have the ability to run this check

How would a user call these checks? What do these checks report? Not all intrinsic modifications are bad. Technically, a polyfill adding a new method to an existing prototype is changing the intrinsics. Polyfills can similarly replace intrinsics to support a new feature or fix a bug.

@benjamingr (Member)

I'm going to go ahead and close this issue. Please feel free to open issues or raise concerns in https://github.com/nodejs/primordials-use-cases
