Dynamically download large .node binaries #6483
Comments
One problem is that many people disable npm postinstall scripts for security reasons. I was thinking of possibly doing this in reverse: rather than downloading on demand, which would only work if postinstall scripts are enabled, we could still bundle all of the prebuilds into the package and delete the unused ones in a postinstall script. That way, if you turn off postinstall scripts, it would at least still work, and if you didn't, you'd save some disk space. I'm less worried about the bandwidth because the tar file downloaded from npm is pretty small; the binaries seem to compress very well.
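A minimal sketch of that reverse approach, assuming a prebuildify-style prebuilds/<platform>-<arch>/ layout (the actual layout used by transformer-js is not confirmed in this thread), might look like this:

```js
// Hypothetical postinstall cleanup: keep only the prebuild that matches the
// current platform/arch and delete the rest, so users who disable postinstall
// scripts still get a working (if larger) install.
const fs = require('fs');
const path = require('path');

// Assumed layout: prebuilds/<platform>-<arch>/... (prebuildify convention).
const prebuildsDir = path.join(__dirname, 'prebuilds');
const current = `${process.platform}-${process.arch}`; // e.g. "linux-x64"

try {
  for (const entry of fs.readdirSync(prebuildsDir)) {
    if (entry !== current) {
      // fs.rmSync requires Node 14.14+; older Node versions would need rimraf.
      fs.rmSync(path.join(prebuildsDir, entry), { recursive: true, force: true });
    }
  }
} catch (err) {
  // If cleanup fails for any reason, keep all prebuilds so the package still works.
  console.warn('Skipping prebuild cleanup:', err.message);
}
```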
The download size might not be an issue for those with a stable connection, but for repositories that rely on open-source contributions, it can be a deciding factor. The download size is 45MB, which is very high. The disk usage is 133MB, which is not an issue for those who use pnpm. An existing example of this problem is the number of dependents of Lefthook vs. Husky: only 3 packages depend on Lefthook while 2000 depend on Husky! https://www.npmjs.com/package/@arkweid/lefthook
Then I'm not sure how to solve this in a way that also works for people who disable postinstall scripts for security. Unfortunately, the npm registry doesn't have a way to upload multiple versions that get chosen between automatically. Also, this is interesting – according to the prebuild-install package you linked:
Are there documented statistics on the number of people who disable postinstall scripts? All the native packages rely on postinstall. Even prebuildify relies on it to check whether the prebuilt binaries are compatible with the host.
I already mentioned this in the OP. If the binaries are not that large, this argument holds and it is better to bundle them. But when each of them is big and you are supporting all architectures and operating systems, it is clear that downloading 15MB takes less time than downloading and decompressing 45MB.
(Installing Parcel without postinstall scripts would only work for macOS x64 and Windows x64 right now, because |
In fact, I also don't understand how someone trusts the runtime code of a dependency but doesn't trust its postinstall script. There is no difference here...
🐛 bug report
Currently, the .node files are all bundled in transformer-js, which makes the download size of the package as large as 45MB. This is because the binaries for all platforms are included in the package. Bundling everything is the simpler approach when each binary is small, but for transformer-js this is not the case.

🤔 Expected Behavior
Reduce the download size
😯 Current Behavior
45MB of download size for transformer-js
💁 Possible Solution
Use a dynamic method for downloading.
Here is a manual approach using a custom install script:
https://github.com/evilmartians/lefthook/blob/master/.npm/install.js
An alternative is to use an existing package such as prebuild-install. A rough sketch of such a script is shown below.
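The following is only a sketch of an on-demand postinstall download, in the spirit of the lefthook install.js linked above. The example.com release URL and the parcel-<platform>-<arch>.node naming are placeholders, not Parcel's real release layout, and a real script should also verify a checksum before trusting the download.

```js
// Download only the native binary for the current platform/arch at install time.
const fs = require('fs');
const path = require('path');
const https = require('https');

const target = `${process.platform}-${process.arch}`; // e.g. "darwin-arm64"
const url = `https://example.com/releases/v2.0.0/parcel-${target}.node`; // hypothetical URL
const dest = path.join(__dirname, 'parcel.node');

function download(from, to, cb) {
  https.get(from, (res) => {
    // Follow a single redirect, which release hosts commonly issue.
    if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
      return download(res.headers.location, to, cb);
    }
    if (res.statusCode !== 200) {
      return cb(new Error(`Unexpected status ${res.statusCode} for ${from}`));
    }
    const file = fs.createWriteStream(to);
    res.pipe(file);
    file.on('finish', () => cb(null));
    file.on('error', cb);
  }).on('error', cb);
}

download(url, dest, (err) => {
  if (err) {
    console.error('Failed to download native binary:', err.message);
    process.exit(1);
  }
  console.log(`Downloaded ${target} binary to ${dest}`);
});
```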
🔦 Context
💻 Code Sample
🌍 Your Environment