Publish pre-built bundles to npm #7398
Conversation
I'm loving how this is looking, great work! I love the idea of a smaller and faster package, not to mention that would remove the need for additional dependencies, or worry about the NodeJS version at all. I spent some time reviewing this code, and I've just got a few suggestions:
@ddbeck, what do you think?
@vinyldarkscratch Thanks for your comments!
Ok, will do.
As is, I use
Are you suggesting that to reduce download size, or at-rest size? As far as I know, npm already gzips packages in transit (and uncompresses them automatically), according to a log of a dry run. Storing the package in compressed format…

Edit: grammar.
I think
I'm referring to the at-rest size when the package is downloaded and installed.
@vinyldarkscratch I added gzip, moved the file-copy code, and changed around a few things. Most notably, the bundling script now uses async I/O for file copying and checks for errors. Also, the bundle now requires (and was tested on) NodeJS 4.2.0, released in 2015, the earliest LTS release in the 4.x branch, as far as I can see.
Is there anything that I should do to advance this PR?
I plan to review this (and the original issue) more thoroughly, but it may take some time (with some time off at the end of this week). My very quick impression here:
Oh, one more thought: I would also like to look at some of the existing tooling for building distributables (e.g., webpack) and make sure we're not reinventing something.
@ddbeck I'll split out the compression part to simplify review.
The current script assumes that the current system (…). We could test the following:
Compression is not essential (I actually excluded it initially to simplify review). I would be glad to defer compression for later.
I don't think anyone bothered to write a tool for a use case as simple as this one. Most bundling tools do a lot of things like dependency resolution, tree shaking, transpiling, etc. In our case, none of that is needed.
I re-wrote the PR from scratch and made everything much simpler. Regarding earlier comments by @ddbeck:
I can create the following process with an end-to-end test (that could be run as an Action or locally):
Unfortunately, only paying npm users can push private releases.
I experimented with lifecycle scripts in #7513 and found them too inflexible. I found two main inconveniences:
@bershanskiy Thank you for the updates on this. I'm sorry it's taken me so long to come back to this. I do plan to return to it, but I'm reluctant to merge a change like this so close to the launch of Yari (our most visible dependent). Watch this space. 😄
@ddbeck Of course, take your time! A bug-free launch is more exciting than a rushed feature. :)
Very excited about this direction! I tried it locally and made a bunch of suggestions.
I added a Mocha test for this in
@vinyldarkscratch A while ago you wrote:
Please note that now
```js
await fs
  .rmdir(directory, {
    force: true,
    recursive: true,
  });
```
By switching Node versions (with nodenv), I discovered that the `recursive` option doesn't work on Node 10. But it's EOL in ~2 weeks. I think we can tolerate the (unlikely) annoyance for that long.
Actually, now that I think on it some more: I guess it doesn't matter at all. If all we publish is a `.json` file, then we can adopt whatever version of Node we want for development.

(Though this also suggests that, should we desire to expose utilities for working with compat data, it would need to be as a separate package.)
> By switching Node versions (with nodenv), I discovered that the recursive option doesn't work on Node 10.

Yes, the docs say `recursive` was introduced in 12.
> If all we publish is a .json file, then we can adopt whatever version of Node we want for development.

I'll confess: it was my secret plan all along :). More recent NodeJS versions have nice features, so I wanted to simplify the transition to modern NodeJS, should you ever choose to make it.
> (Though this also suggests that, should we desire to expose utilities for working with compat data, it would need to be as a separate package.)

Does BCD expose any utilities now?
> I'll confess: it was my secret plan all along :)
😄 Yeah, there's a ton of hidden benefits to this. For example, we could add internal-only fields to the schema that might make working with the data more ergonomic, but filter them out before building the final data for consumers.
> Does BCD expose any utilities now?
No, not yet. It's something we've maybe inched toward (see #9441), but have been sort of cautious about it. I rather like the idea of decoupling the data from such utilities anyway and this PR would sort of force that decision.
Since this test is slow, perhaps we can put mocha after the linting, for the convenience of routine edits?
We can probably do something to improve the performance of
Done!
Yes. However, the tests themselves can be improved (and made more async):
@bershanskiy I think you're on the right track with the performance. Right now, we do I/O (…). I did an experiment to first create promises for all the file reads, to read everything in parallel. That can bring down the (uncached) time by perhaps half a second, but doesn't do much when cached. I think that loading all files in parallel isn't the optimum though; probably one could do better by reading one at a time but never stopping for
Here's the async variant of `load`:

```js
async function load(...dirs) {
  const result = {};
  const stack = dirs.map(dir => path.resolve(__dirname, dir));
  const promises = [];
  while (stack.length) {
    const dir = stack.pop();
    const entries = await fs.readdir(dir, { withFileTypes: true });
    for (const ent of entries) {
      const entpath = path.join(dir, ent.name);
      if (ent.isDirectory()) {
        stack.push(entpath);
      } else if (path.extname(entpath) === '.json') {
        promises.push(fs.readFile(entpath).then(JSON.parse));
      }
    }
  }
  for (const p of promises) {
    const data = await p;
    extend(result, data);
  }
  return result;
}
```

A remaining problem is that it changes the order of the keys in the objects, which isn't even going to be deterministic on all platforms unless the
Looking good! Leaving final review for @ddbeck though.
OK, I've read back through the thread and I have one question remaining. If I install this package from source (i.e., …), …

(Compare this to #7513, where running …)

Is that weird or surprising? This is an honest question. I'm legitimately unsure how significant that is.
@ddbeck I'm no expert maintainer of npm packages, but an "installing from source should equal dist" rule doesn't seem like something anyone can safely assume. Take https://github.com/w3c/webref, which is the source of two packages: there isn't any URL at all one could install to get the equivalent of the … So this seems fine to me.

Using lifecycle scripts might still be a good idea, however. The tradeoff is that any build step will require contributors to run the build script manually after any change, or risk being confused when their changes don't seem to work. Lots of packages work like that, so it's not unworkable, but it does seem avoidable in this case.
@foolip, having an existing, real-world example to consider is really helpful. This works for me. Thank you!
OK, I'm going to merge this now. Outstanding work, and thank you for putting this together and for your patience, @bershanskiy! And thank you, @foolip, for giving this a really excellent review too.
I've written a rather lengthy release note for this PR in #9865. If either of you has a chance to take a look at that before today's release, that would be especially welcome (though strictly optional).
This is a simple implementation of publishing "bundled" releases to npm, which are much smaller and load faster than the current releases. Background: #7374.
Benchmark (updated)
Release contents

- `scripts` (which describes script files already omitted from npm releases)

A checklist to help your pull request get merged faster: