[Stage 3] Web APIs to be exposed to ShadowRealms #393
I'm still not convinced by the lack of I/O-related APIs in ShadowRealm. It's fairly straightforward to remove them if necessary for a given use case, but on the other side it's extremely onerous to implement them over the callable boundary for use cases that require them (such as test runners). In particular, I am confused why secure-context APIs would be wholesale disallowed. This makes an assumption about the nature of the code running inside the ShadowRealm, and, as above, if these APIs present too much authority, it should be straightforward to remove them. I also suspect that access to origin state (IndexedDB, etc.) would be valuable, but I admit I don't have use cases to provide. Conceptually, workers and communication APIs would seem valuable as well, but I can see how hard it might be to tame those. I am worried the lack of Worker and communication APIs would similarly limit the test runner use case if there is no way to get transferable data in and out of a ShadowRealm. I agree that canvas, multimedia, etc. seem not needed. The question of taming exposed authority is an interesting one when the code is able to create new evaluation contexts (new shadow realms or workers). From experience it's difficult to implement such inescapable removals, and my hope would be to have support from the language for registering and executing "first run" code in newly created execution contexts. While that is definitely follow-on-proposal territory, I am not convinced we should prevent exposing the more powerful APIs today.
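To make the "onerous over the callable boundary" point concrete, here is a minimal sketch (not from the comment above) of what an embedder such as a test runner would have to do to hand a fetch-like capability into a realm, assuming the proposal's `evaluate`-based callable boundary where only primitives and callables cross; `fetchText` and `installFetchLike` are illustrative names:

```js
const realm = new ShadowRealm();

// Host-side I/O that the realm cannot perform on its own.
const fetchText = (url) => fetch(url).then((res) => res.text());

// Response and Promise objects cannot cross the callable boundary, so
// everything has to be flattened to primitives and callables.
const installFetchLike = realm.evaluate(`
  (hostFetchText) => {
    globalThis.fetchText = (url) =>
      new Promise((resolve) => hostFetchText(url, resolve));
  }
`);

// Results flow back through a realm-created callback, since the host's
// Promise itself cannot be passed across the boundary.
installFetchLike((url, resolve) => {
  fetchText(url).then((text) => resolve(text));
});
```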
Picking some names from the last meeting notes and related issues: @dminor, @syg, @mgaudet, @msaboff, @gibson042, @littledan, @ljharb: PTAL and let me know if you have any questions. I'd deeply appreciate the time spent here so we can move this forward.
@mhofman I'm ultimately confused by your statement. The initial ask for this proposal has consistently been that we would work on virtualization, and therefore we could even afford having only a minimal subset of ECMAScript globals in a ShadowRealm. The critical features requested by the champions were to avoid unforgeables and to ensure the global names were configurable. The expansion we are presenting in this issue is an agreement among the champions, delegates, and browser implementers to not make instances of ShadowRealm totally alien to web reality and compatibility. We can always work to expand and improve this later, but right now my direction is toward unblocking it. I hope we can work together this way, or find a way to work together constructively.
I'll try to schedule some time to look at this in more detail in the next week or two. Initial feeling is one of surprise at the scope of inclusions, but I am very happy to see a rationale behind the selections (and happy to see PerformanceMark and friends no longer included). Thank you Leo for the ping.
“secure context” means HTTPS, so i'm not sure why a shadow realm isn't one - and either way, it's already been established, i thought, that locking web features behind secure contexts prevents them from being widely adopted.
@leobalter I am concerned about making this proposal useful to a variety of use cases. While the proposal does enable virtualization, and I agree a minimal subset of APIs would likely be sufficient for that, fully virtualizing a browser comes at a high complexity cost and likely low performance. I was hoping that supporting more APIs would reduce the virtualization burden on users, and possibly enable more use cases for this proposal. I am fine with these browser APIs being postponed to future extensions, but the omissions here are being framed as intrinsic to the nature of these APIs, which is where I disagree.
@mhofman The error in phrasing is mine. Please read things like "ShadowRealm code shouldn't be able to access the network" as "With this minimal feature set, ShadowRealm code shouldn't be able to access the network", not as pronouncements about the intrinsic usefulness.
Even this seems problematic: this set of APIs makes it look like ShadowRealms are a secure environment for running untrusted code, which makes it likely people will mistakenly use them as such and be stung later if authority-carrying APIs are added.
After giving this more thought, I think the baseline should be the WinterCG Minimum Common Web Platform API, minus the fetch-related APIs.
I also think some secure-context APIs should be exposed, especially given that the position of at least some browsers is that all new APIs are secure-context only, even things with no security implications. Many secure-context APIs could, and IMO should, be exposed in shadow realms, especially the crypto-related ones.
As far as I can see, these are the APIs that would be added to Leo's previous proposal if adopting the WinterCG common minimum API, minus fetch.
Of that list, my concerns would be
My general principle is: things which are pure computation should be included; anything else deserves more skepticism. One caveat is that browsers may not want to expose secure-context-only APIs in insecure contexts, so those may need separate consideration.
Those seem fine, altho i'm not sure what the point of AbortSignal is if there's no fetch? (setTimeout and friends seem far more useful to me than console, fwiw)
This list would be easier to review if there was a statement at the top explaining what the criteria are for inclusion and why (not in terms of spec stage, but rather the no-I/O thing). @bakkot's explanation at #393 (comment) resonates for me, for example. It's good to see the tests listed above, but do these work in the WPT harness for any browsers yet? It would be nice if the harness ran shadow realm tests when enclosed by each other type of global (e.g., shadowrealm inside window, shadowrealm inside worker). There should also be some nested shadow realm tests, to make sure it's possible to get to the recursive outer global where needed.
I'm curious, why do you see those as a concern?
They're not pure computation. (To be clear, I'm not saying those must be left out, just that they need more thought than the others. I could see arguments either way.)
ShadowRealm does not on its own provide a confidentiality boundary. All these APIs would remain configurable, and as such removable for use cases with confidentiality concerns. IMO that makes performance OK for inclusion. I could imagine setTimeout being a concern from an implementer's point of view, given that a single scheduled callback could keep the whole realm alive. But I suspect at some point we'll want to extend the amount of host features available in a ShadowRealm, which would bring the same concern.
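As a rough illustration of the "configurable, and as such removable" point above (a sketch only; whether setTimeout ends up in the exposed list at all is exactly what is being debated here):

```js
const realm = new ShadowRealm();

// An embedder with confidentiality concerns deletes the globals it does not
// want realm code to see, before handing the realm to that code.
realm.evaluate(`
  delete globalThis.setTimeout;
  delete globalThis.clearTimeout;
`);

// Later evaluations in the same realm no longer observe those names.
realm.evaluate("typeof setTimeout"); // "undefined"
```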
If the plan is to add more powerful APIs later, then sure, it would make sense to
ShadowRealms could do this anyway with `Atomics.waitAsync`:

```js
// timeout after 5 seconds (awaiting .value, the promise that settles after the wait)
await Atomics.waitAsync(new Int32Array(new SharedArrayBuffer(4)), 0, 0, 5000).value;
```
There is a substantial danger here that people treat ShadowRealm as some kind of sandbox for untrusted code, when it is expressly no such thing. There's also no clear reason to follow such a principle when ShadowRealm is claimed to be designed for a wide variety of use cases, not just running hyper-deterministic code. If ShadowRealm ships with the list of APIs in the OP, people are absolutely going to see ShadowRealm, say "oh look, JS finally added a sandbox", and then be bitten hard when they discover that, oops, it is not one.
AbortSignals can be used in
@Jack-Works i don't see EventTarget (or anything that uses it) on that list, though.
Yes, of course I have the same concern about
Sure there is: it is much, much easier to reason about code which only does pure computation, and it's much easier to virtualize pure programs because you are, by definition, providing them all of their impure capabilities. Wasm works on the same principle. Since virtualization is the stated purpose of the proposal, that seems like an obvious line to draw, to me. Now, maybe we don't care about that goal, in which case, fine. But in that case, why not also provide network APIs?
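A minimal sketch of the virtualization model being described, in which the host explicitly hands every impure capability to otherwise-pure realm code; the fake clock and the `run`/`now` names are illustrative, not part of the proposal:

```js
const realm = new ShadowRealm();

const run = realm.evaluate(`
  (now) => {
    // Code inside the realm never reaches for real time on its own; it only
    // sees whatever capability the host chose to pass in.
    return "started at " + now();
  }
`);

let fakeTime = 0;
run(() => fakeTime++); // "started at 0" -- fully deterministic
```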
And I heard the global object of a ShadowRealm on the web is also an EventTarget.
The proposal explainer clearly states not having I/O in shadow realms is a non-goal. As stated:
Plenty of the stated use cases (like testing) are made considerably worse if you have to reimplement often-complex APIs (like fetch) for a shadow realm. Yes, virtualization is one of the use cases, but this can be accomplished regardless of the initial presence of APIs. If people want a fully deterministic environment, that is something SES is specifically meant to provide.
I have no idea why this specific list of APIs is being proposed; I fully would've expected ShadowRealms to support any API that is useful and isn't intrinsically tied to a specific global (e.g. I wouldn't expect the DOM, as it's tied to the document). ShadowRealms have been sold as a non-confidential, non-deterministic, non-sandbox mechanism for a long time now. This sudden detour to a sandbox-esque set of APIs is an unexpected change from that messaging.
The main complications with network APIs like fetch are:
IMO these are not fundamental concerns that block network APIs from ever being exposed in ShadowRealm, but they do warrant taking some time to explore the right approach. On the other side, I find setTimeout to be an API most JS developers would expect to be available. I do share @Jamesernator's concern that providing computation-only APIs will give applications a false sense of confidentiality sandboxing (Date.now is enough to perform some Spectre-style attacks), that it doesn't sufficiently cover all motivating use cases, and that introducing the powerful APIs later will break too many assumptions of programs already using ShadowRealm.
OK. If we in fact do not have as a goal that we only want pure computation, so that programs are easy to virtualize, then we should have a much, much broader list of APIs included. Personally I would be happier with the wasm model, but if this proposal wants to go in a different direction, that could be ok too. But in that case, people who don't like my suggested principle for choosing APIs should suggest a different one. I would not be inclined to leave out
+1, this is the route this proposal originally went: it used to have time, timezone & locale, and Math.random() virtualization. Even so, I've never liked the idea of any Web API being added to the ShadowRealm (because this creates inconsistency between using it on the Web and in Node), but all the others strongly want to add it.
With Node and other platforms working on aligning with WinterCG's common minimum API, I'm optimistic that we can expose in their ShadowRealms all the APIs which the Web puts in its ShadowRealms--these APIs should be within the common-minimum-API-compatible subset. WinterCG hopes to work on similar sorts of test automation to verify that this is implemented.
We need to make a decision here. This extremely fundamental question has been on the table for at least two years. Until the champions take a strong lead in pushing us towards a particular conclusion, this proposal will remain at Stage 2. The start of the discussion above omits a rationale for the organizing principle of omitting I/O, and that needs to be fixed to move forward.

My opinion, FWIW

I really think we should not have I/O within ShadowRealms by default, though some things like timers and console may be OK.

Why I'm not pushing for exposing only JS builtins, nothing more, nothing less

The rationale raised by many web stakeholders for why to include more than just the JS builtins: given the lack of a very clean layering between what's defined in JS and what's defined in web APIs, we should work towards exposing a mental model based on a more general principle around capabilities, rather than standards venue. [Otherwise we should work on a project of changing venues of certain existing APIs, which would be messy, a lot of work, and would upset existing models of ongoing API maintenance.] I don't think that doing just the ecma262 builtins would correspond to the Wasm model: Wasm doesn't have direct access to JS builtins any more than it has access to Web APIs. They are both omitted. And I am optimistic about Node/Web alignment in the medium term while including these things.

Why I would like I/O to be omitted

The use cases of ShadowRealms tend to be around doing some sort of separation from the surrounding context. This is why we made the callable boundary restriction. I/O obviously is a way through these restrictions. We don't expect an extremely strong, process-like kind of isolation to be possible, but at least we want to make things somewhat separated. Removing I/O (focusing on I/O which can be used to communicate with external pieces of code, which would not include console, timers, or hopefully even module loading [with proper use of CSP]) helps us get this separation by default. I/O can be added back in with the exact same mechanism we use for objects.

Two kinds of ergonomics

ShadowRealms generally aren't super ergonomic to set up--you kinda need a membrane for any nontrivial use. We've argued about this repeatedly, and kept coming back to an API that has this property. This is what's needed to communicate objects, or even get code set up in the first place. And once you have that mechanism, you can use it to expose I/O very easily, to the degree desired by the embedder. By contrast: code running within a ShadowRealm is ordinary JavaScript, fully featured. All the syntax, all the language features. It would make sense if this came with all of the normal computation-related library functionality--this is kind of part of that ergonomics.
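For readers less familiar with the setup being referenced, a rough sketch of that boundary mechanism; the module specifier "./inner.js", the export name "main", and the log capability are hypothetical:

```js
// (in a module context, so top-level await is available)
const realm = new ShadowRealm();

// Getting code into the realm already goes through the callable boundary...
const main = await realm.importValue("./inner.js", "main");

// ...and that same mechanism is how an embedder grants exactly as much I/O
// as it wants: here, a single logging capability and nothing else.
// Arguments passed back and forth must be primitives or callables.
main((message) => console.log("[realm]", message));
```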
My opinion is aligned with Dan's. While I originally wanted network access to enable more use cases without requiring a complex membrane, there are simply too many complexities with exposing an API with such powers and ability to cause effects. I support postponing the powerful I/O APIs to a later time when we can figure out the best way to expose them. The reason I suggested the WinterCG common API as a starting point was to answer the requirement expressed by some delegates that the environment in the ShadowRealm be familiar to JavaScript developers, and not require them to understand the boundaries of standards groups.
This piece from @littledan captures well the rationale behind the list we are proposing. Participants in the ShadowRealms discussions (like in this thread) often each bring yet another use case, and we are trying to make fair room for those. While I don't oppose I/O, it seems like it would need further complex spec rethinking before inclusion. For timers, an agreement to accept those still seems to be needed. It would be immensely helpful if I could get an objective list of what should be added next to meet a minimum bar (aka MVP) to start Stage 3. I'm afraid my subjectivity might not capture all blockers to move this proposal forward. Thanks!
I think the no-I/O subset will be a pretty "dense" (large) subset of WinterCG. But it certainly won't include everything that you can count on in WinterCG environments--for example, it would leave out any future socket API. The important thing would be to have a set of organizing principles creating predictable correspondences between environments. I think the set of APIs listed at the top of this thread provides that, by this "no I/O, just computation" principle, modulo the limited exceptions discussed above.
To move to 2.7, I would expect:
To move to Stage 3, I would expect:
I wanted to echo some of Dan's points above.
I think moving slower on ShadowRealms is justifiable simply because the host integration is causing some Conway's Law issues, where the TC39 participants aren't as expert in the host integration as they ought to be. (I am attempting to improve this, but have no promises to make.)
By challenges, do you mean the kind of challenges you encountered on a per-API basis when trying to expose them inside ShadowRealms? Chrome is not actively working on the web API side of ShadowRealms until there's clarity on the exact set of APIs to be exposed to ShadowRealm, so we have no feedback to give yet.
Re: secure context, is there a reason that ShadowRealm doesn't inherit the secure-context-ness?
Okay, I'm very late to the party! This is going to be a lengthy post.

The Goal

Our objective is to compile a list of APIs that will be accessible within the ShadowRealm. It's important to understand that this list is not final; it will evolve over time as new features are added, mirroring the development of other global objects. To clarify:
This means if there's a feature or opinion you have regarding something not included on the list, we encourage you to open an issue for consideration. It's very likely that it won't be included in the initial iteration, but that's part of the process.

Confidentiality

Although the ShadowRealm cannot guarantee confidentiality as stated in the explainer (it's marked as partial), our goal is to maintain confidentiality for the APIs exposed inside a ShadowRealm. We acknowledge that side-channel attacks are a potential threat, which means the ShadowRealm should not be viewed as a secure boundary. However, this doesn't mean we're indifferent to protecting information to the best of our ability. And we do so by excluding APIs that could potentially leak information.

The Reasoning/Principles

For the first iteration, I'm evaluating each API based on the following questions:
If the answer to both questions is "yes," then the API is considered for the initial batch. If you have an undocumented use case, please open an issue. Similarly, if you believe confidentiality can be maintained in a specific case, let us know through an issue, and we'll examine the details. It's important to note that we, the champions, are not experts in Web APIs, so there may be aspects we're overlooking that could be addressed in future updates. This approach is straightforward, in my opinion.

Note: APIs that are not yet standardized, or are in the process of being standardized, are not included at this stage.

Noise and Misconceptions

Given that this proposal has been around for a long time, a lot has changed over the last decade. As a result, there's a lot of outdated information and misconceptions still circulating, which creates confusion. Here are some clarifications:
This doesn't make any sense to me: if ShadowRealms don't preserve confidentiality, why bother doing it partially? All this partial confidentiality does is make the API considerably worse for many use cases. "Partial confidentiality" is not a use case; a test runner is a use case, SES is a use case, code segmentation is a use case. "Partial confidentiality" is just a property that needs to be justified with respect to the use cases it is actually for.
This just seems incoherent. Let's consider the use cases LISTED IN THE EXPLAINER; it's not clear to me how "partial confidentiality" benefits any of them:

Trusted Third-Party Scripts

Code Testing

Codebase Segmentation

DOM Virtualization

Whether specific APIs need to be virtualized is really a case-by-case question; some situations of codebase segmentation might want to segment particular APIs, for example.

SES (not listed in the explainer)

Yes, confidentiality is important for SES, but a whitelist is necessary anyway, so partial confidentiality is irrelevant here.
There is a difference between an API that lets you directly read (or even write) data of the main realm, and an API that lets you derive data through side channels. What I understood is that preventing the latter is out of scope. The former is simply postponed to the next iteration.
Module import requires the server to respond with a correct content type. Most servers do not expose sensitive information in scripts. I don't remember where the discussions about the module map ended up, but if the module map is not shared with the main realm, maybe there could be a fetch header set indicating the request came from a shadow realm? I'm not sure this is something that needs to be figured out for the first iteration, or if that could be added later.
Yes. Precisely this kind of feedback. I am concerned, and remain concerned, that this is going to be a problem from the web-engine perspective, and would love to have some feedback from other implementers about specifically exposing the host hooks and the challenges and engineering costs it carries. I sort of get the impression that people assume that because WebIDL has an exposure annotation, flipping it on is trivial; I have already found that it's non-trivial to expose some of these APIs. I'm working on trying to get more confidence in our ability to ship this; it will take time however. This will distinctly be a proposal where Stage 3 feedback is going to be critical.
WebKit is mentioned elsewhere; is this proving to be trivial there? What's the status of their experimentation?
A bit off-topic, but really? In the content script of a web extension, the global object is an object inheriting from the Sandbox interface, not the Window interface (https://bugzilla.mozilla.org/show_bug.cgi?id=1208775). I wonder if Gecko can do the same thing here that they did for content scripts.
I've been working on a design principle (discussion, PR) which I believe settles this issue. It gives a simpler rationale for which APIs to annotate for exposure in ShadowRealm. I developed this principle based on a number of conversations with implementers and web platform experts. It has been received positively and I believe it has broad support. We now provide a familiar and generally useful JS environment where the distinction of which standards body standardized a thing is irrelevant, while also ensuring that ShadowRealm doesn't expose anything that the surrounding realm might not be able to provide. This is the list:
That seems great, except: as I've said before, given that browsers have started restricting purely-computational APIs to secure contexts, I strongly disagree with leaving out secure-context-only APIs. It means that, for example, ShadowRealms should support crypto.randomUUID, but under this principle would not in insecure contexts.
Didn't WICG/uuid#23 (comment) mean that it would be available in both, and thus in ShadowRealms?
Nope, see the thread I linked. Browsers did not in fact implement it in non-secure contexts, despite the TAG recommendation, and the spec writes down what browsers actually do, not what people would like them to do.
gross, thanks for clarifying.
While I agree with the sentiment @bakkot, I feel that the fight is somewhere else. Pushing for allowing these APIs to work in insecure contexts seems like the right conversation, and I do believe that, with this design principle codified, we can use ShadowRealm as yet another argument to get some of those APIs to work in insecure contexts.
@caridy I've tried multiple times over multiple years. I don't think browsers are going to budge here - certainly not for stuff like crypto.randomUUID. And in any case, I don't see why the secure-context restriction should have any relevance to ShadowRealms. "Pure computation" is a nice simple rule. "Pure computation, except not secure-context-only things, even in secure contexts" is not a simple rule and doesn't really make any sense.
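To make the "pure computation" framing concrete: the secure-context-gated API discussed just above (crypto.randomUUID, per the linked WICG/uuid thread) does nothing that userland code couldn't do on top of crypto.getRandomValues, which is not secure-context-restricted. A sketch, assuming getRandomValues is reachable in the realm in question:

```js
// Userland UUID v4 on top of crypto.getRandomValues: bytes in, string out --
// pure computation apart from the entropy source.
function uuidv4() {
  const bytes = crypto.getRandomValues(new Uint8Array(16));
  bytes[6] = (bytes[6] & 0x0f) | 0x40; // set version 4
  bytes[8] = (bytes[8] & 0x3f) | 0x80; // set RFC 4122 variant
  const hex = [...bytes].map((b) => b.toString(16).padStart(2, "0")).join("");
  return [
    hex.slice(0, 8),
    hex.slice(8, 12),
    hex.slice(12, 16),
    hex.slice(16, 20),
    hex.slice(20),
  ].join("-");
}
```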
Back in September 2023, TC39 agreed to move this proposal to Stage 2, with the remaining requirement for Stage 3 being a list of suitable APIs to be exposed to ShadowRealms, along with sufficient tests to ensure correct behaviour in implementations.
Salesforce is currently working with Igalia to organize this work. Igalia produced the following list mapping all the names below:
Web APIs exposed in ShadowRealm
Defined in WebIDL as of whatwg/html#9893.
Checklist indicates which APIs are already covered in WPT.
Tests opt into ShadowRealm coverage via the `// META: global=window,worker,shadowrealm` header.

* = missing from the WebKit implementation.
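For illustration, a hypothetical WPT-style test file using that header; the asserted API (TextEncoder) is just a placeholder, not a claim about the final list:

```js
// exposure.any.js (hypothetical file name)
// META: global=window,worker,shadowrealm
test(() => {
  assert_equals(typeof globalThis.TextEncoder, "function");
}, "TextEncoder constructor is exposed in this global");
```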
In Progress
Additional names being added via spec PRs and WPT coverage:
Additional rationale regarding Workers
With rationale for not exposing in ShadowRealm:
- Interfaces marked [SecureContext] aren't exposed.
- Date and Temporal.Now.
- Inclusion criteria: specs that are at least Candidate Recommendation, omitting any W3C Working Drafts, API drafts from WICG, etc. Also omitting worker-specific APIs only exposed in workers and not also in window.
- Blob and related APIs (currently a W3C Working Draft) could arguably be included, but initially we decided not to expose them because they are somewhat tied to fetch and file APIs. The HTML spec references Blob, but only in APIs that are not available in ShadowRealm.