WIP: Demonstrating Native Addons #38
Conversation
In the Node-API docs it is explained that Node-API is a C API. So NAPI is actually a C-based API, which means leveldown and classic-level are just using C, and not C++. I'm guessing that if we wanted to use C++, we would also have to bring in `node-addon-api`. I wonder then why it's named the way it is.
Until arrterian/nix-env-selector#68 is figured out, this is a project-by-project thing.
Some additional findings:

The relevant lifecycle script is the `install` script, which runs `node-gyp-build`. A problem right now is that upon running our 2 existing commands, our own addon isn't recompiled. So the result is to add 3 commands, where the first 2 cover the existing dependency behaviour and the third recompiles our own addon.
Therefore if we only want to recompile our own C++ addon, then we must run the rebuild command directly. Now this leaves a wrinkle in the workflow; I may then link up this rebuilding call to the main build script.
The tests work, and running the executable also works. Now to see how `pkg` handles the native addon.
The changes here are likely to be quite significant, so it's going to be worthwhile to keep this as a separate top-level branch named `native`.
In order to ensure that our CI/CD during testing is using the same dependency set, I originally thought I could just bring the packages in directly.
Something I've noticed is that deep strict equality doesn't work on arrays and objects returned from the native module. I'm not sure why; even though the values print identically, the assertion fails. But otherwise, comparing against a copy works. So there must be some difference between objects created in JS land and objects created in N-API and returned, that prevents deep strict equality from working. This affects jest testing and any usage of deep-equality assertions. Raised issue on jest: jestjs/jest#12814
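A minimal sketch of the symptom (the `makeArray` export and its signature are assumptions):

```ts
import assert from 'node:assert';
import native from './native'; // hypothetical typed wrapper around the addon

const arr = native.makeArray(3); // array constructed inside N-API, e.g. [0, 1, 2]

// Copying into a fresh JS array makes deep strict equality pass...
assert.deepStrictEqual([...arr], [0, 1, 2]);

// ...while comparing the native-returned array directly can fail,
// even though the contents look identical.
assert.deepStrictEqual(arr, [0, 1, 2]);
```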
For some quick debugging we can use:

```cpp
// Dump a buffer's bytes in hex (buffer and bufferSize come from the addon code)
for (size_t i = 0; i < bufferSize; i++) {
  printf("%#x ", buffer[i]);
}
```

But I'm still getting refamiliarised with C/C++ code here... and I suspect a better debugging experience can be had by using a debugger in the future. During jest testing, raw `printf` output is awkward to correlate with the tests. Furthermore, programming at the NAPI level is really quite low level. The NAPI macros are not really comprehensive; there are quite a few areas where they are lacking. It's a C-focused API. So if we are really doing hardcore C++, it's much better to use https://github.com/nodejs/node-addon-api. The alternative is Rust.
We now have 5 example NAPI functions, including `addOne`, `timesTwo`, `setProperty` and `makeArray`. I think this should allow us to proceed to the next stage.
The TS interface is added. All we needed to do was define the interface for the native module in our TypeScript source and cast the loaded addon to it.
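A sketch of such a declaration (the signatures are assumptions based on the example functions):

```ts
import nodeGypBuild from 'node-gyp-build';

interface NativeAddon {
  addOne(n: number): number;
  timesTwo(n: number): number;
  setProperty(obj: Record<string, unknown>, value: unknown): void;
  makeArray(length: number): number[];
}

// node-gyp-build returns an untyped module; we assert our interface over it
export default nodeGypBuild(__dirname) as NativeAddon;
```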
Now on to task 3.

We need to first integrate our build scripts into the `nix` expressions, so that `node-gyp-build` and the other `package.json` hooks run during the build. The goal is to eliminate our `postInstall` hook by relying on `package.json` hooks instead. I wonder whether, since the resulting `*.node` file is just a shared object, this will just work in the Nix build.
Search through the current `node_modules` to see how dependencies ship their prebuilt addons. Also take note of the directory layout.

We can see here that the compiled addons are all called `node.napi.node`, but they are also in architecture-specific directories, in particular `<platform>-<arch>` ones. In one particular case, there are also multiple variants within a single platform directory. This also reveals to us that the matching binary is selected at runtime.
For leveldown it looks like this:
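An illustrative layout (the exact platform set and file names vary by leveldown version):

```
prebuilds/
  android-arm/node.napi.armv7.node
  android-arm64/node.napi.armv8.node
  darwin-x64/node.napi.node
  linux-arm/node.napi.armv7.node
  linux-arm64/node.napi.armv8.node
  linux-x64/node.napi.glibc.node
  linux-x64/node.napi.musl.node
  win32-ia32/node.napi.node
  win32-x64/node.napi.node
```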
A lot more Android flavours are available, and even a choice between glibc and musl. And for UTP:
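Again an illustrative layout, with fewer platforms and no libc variants:

```
prebuilds/
  darwin-x64/node.napi.node
  linux-x64/node.napi.node
  win32-ia32/node.napi.node
  win32-x64/node.napi.node
```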
Ok, the next step is to introduce `prebuildify`. There's an interesting discussion here svanderburg/node2nix#187 about the ability to slim the resulting docker image, which probably contains nodejs and many other build-time dependencies.
So basically run this (by default the output lands in the `prebuilds/` directory):
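A sketch (the exact flags depend on the project; `--napi` and `--strip` are standard prebuildify flags):

```sh
# Build a N-API prebuild for the current platform/arch into ./prebuilds
npx prebuildify --napi --strip
```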
And these, along with potentially other things, will just be executed within the CI/CD to produce all the relevant prebuilds. There's probably a structure expected to be uploaded to the release page. Furthermore, it appears that the published package is expected to carry prebuilds for every supported platform; however, it is expected that the developer assembles these.
Whereas for me, I only have the prebuild for my own Linux platform.
So the question is: how do we get all the other compiled binaries, if we are developing on Linux and we want to publish our package? The realisation here is that you don't actually need to publish the binaries to GitHub releases; the usage of GitHub releases is just the convention of the prebuild ecosystem.
It appears that it is expected that the developer just goes and fetches all the prebuilt binaries for all the platforms from a CI system, and then puts them all together manually into the `prebuilds/` directory.
Crucial information is located here: https://github.com/prebuild/prebuild-install, as prebuildify is an evolution of `prebuild`. This basically means that after the CI/CD has built everything and pushed it to the release page, you then download the artifacts and put them together into the `prebuilds/` directory before publishing. This seems rather clunky; I'd prefer that we have a single command to assemble the `prebuilds/` directory. If we are going to do this, this means our CI/CD has to run the prebuildify commands on all the different platforms, meaning that PRs which contain new features will result in new compilations on each platform.
The CI/CD workflow used by the prebuildify ecosystem is ultimately explained here: https://github.com/mafintosh/prebuildify-ci. Basically the proposed flow is: trigger CI builds for each platform, download the resulting prebuilds into the local `prebuilds/` directory, and then publish the package with all prebuilds bundled.
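Sketched as commands (subcommand names are assumptions based on the prebuildify-ci README):

```sh
# After CI has built and uploaded prebuilds for each platform:
npx prebuildify-ci download   # assemble the ./prebuilds directory locally
npm publish                   # publish the package with prebuilds bundled
```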
Right now we cannot use all of this directly. This workflow does have some problems: 1. assembling and publishing the prebuilds is a manual step for the developer, and 2. every platform's binaries end up bundled into the one published package.
The solution to 1. is to make use of the CI/CD features more, such that assembling the prebuilds and publishing is automated. The solution to 2. involves these new developments:
Basically the idea is that each platform build goes into its own NPM package, and the parent package then pulls in only the matching one (for example via `optionalDependencies`). There's an overall problem with distributing binaries: the lack of determinism/reproducibility and of supply-chain verification. These binaries are just put into the npm package without any oversight, and no source code verification can be done. It does seem that the existence of binaries is only for convenience, and a good build should be done from source.
Getting CI/CD to do everything will require the more sophisticated release pipeline that was first specced out in our strategy letter 3: https://gitlab.com/MatrixAI/Employees/matrix-team/-/issues/8#note_885403611. This might be complicated if we are going to maintain 2 branches: `main` and `native`.
I'm going to push this to https://github.com/MatrixAI/TypeScript-Demo-Lib-Native later at the end, and then submit a request to https://gitlab.com/gitlab-com/macos-buildcloud-runners-beta/-/issues
For the arm64 binary on macos, the target architecture has to be passed explicitly.
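A sketch, assuming prebuildify's `--arch` flag:

```sh
npx prebuildify --napi --strip --arch arm64
```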
The final integration runs are failing. I suppose because when using `pkg`, the bundled executable isn't finding the native addon at runtime.
I noticed that the windows integration run command doesn't fail even when the command fails. I guess the PowerShell script doesn't propagate the exit code of the command. Either that or the failure is being swallowed somewhere else. See: https://gitlab.com/MatrixAI/open-source/typescript-demo-lib/-/jobs/2468682176
The integration runs are still failing. I need to debug this locally, by seeing why our bundled executable isn't running. Right now both windows and macos take quite a bit of time to start up; we could aid this by changing the cache:
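A sketch of the kind of cache configuration meant here (paths are assumptions, based on the cache entries discussed below):

```yaml
# .gitlab-ci.yml (sketch): cache package-manager downloads between runs
cache:
  key: "$CI_JOB_NAME"
  paths:
    - ./tmp/npm      # npm cache
    - ./tmp/choco    # Chocolatey cache (windows runner)
    - ./tmp/Homebrew # Homebrew cache (macos runner)
```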
This might be enough to proceed. Nix caching is even more difficult, since fundamentally it's a network cache, and we aren't able to get gitlab to cache `/nix/store`.
It turns out we can simply rely on Xcode already being present, cause if Xcode wasn't installed we would end up with this problem anyway. The automatic ways of installing Xcode are actually quite complicated https://apple.stackexchange.com/questions/107307/how-can-i-install-the-command-line-tools-completely-from-the-command-line (and very flaky; none of it is intended to be automatic), so for now we will simply expect that the gitlab runner already has Xcode installed, and we don't need to install it ourselves. I found this out by trying to install the new brew on the newly updated runner.
The caching didn't really save that much time for both windows and macos. In fact, I don't think choco is caching anything; it seems to redownload each time. See discussion: chocolatey/choco#2698 and chocolatey/choco#2134. Therefore I'm removing these:

```yaml
# Chocolatey cache is only used by the windows runner
- ./tmp/choco
```

and

```yaml
- choco config set cacheLocation "${env:CI_PROJECT_DIR}\tmp\choco"
```

Homebrew marginally improves perf by using the cache, but it's not significant.
I've requested it to be packaged at top level here: NixOS/nixpkgs#173467. But there's a fork of it that could be used in the meantime.
Apple now requires mandatory code signing and mandatory notarization for all executables, application bundles, and packages. This applies to all iOS and macOS devices. Any program attempting execution without a legitimate signature is automatically closed. In a nutshell, our steps are: acquire a Developer ID Application certificate, sign the executable with it, and then notarize the result. These are detailed below.
Setting up the Developer ID Application Certificate

In order to do code-signing on our CLI executable, we must acquire an X.509 certificate called a "Developer ID Application" certificate. This certificate is necessary to distribute applications outside of the Mac App Store; this is because we are distributing this CLI executable directly out of the GitHub release page. Apple has other certificates intended for other distribution purposes. Each certificate requires its own keypair; you cannot use the same keypair for different certificates. After signing with the certificate, the application must also be notarized with Apple, which tells Apple what kind of capabilities the executable requires.

To acquire this certificate, we must have an account on developer.apple.com. We had already set this up back in 2020. Although the X.509 certificate work can be done entirely on the command line, the keychain route is easier. Launch the Keychain Access application. By default there is already a login keychain. Click Keychain Access > Certificate Assistant > Request a Certificate From a Certificate Authority. This will open up a panel where you can fill in the user email address and common name for the request.
Select "Saved to disk" so that the certificate signing request (CSR) is written out as a file. The keypair must be RSA 2048 as dictated by Apple; this is the default, so we don't need to change anything for the keypair generation. Once this is done, the keypair will appear in your login keychain.

Then go to https://developer.apple.com/account/resources/certificates/add and select "Developer ID Application". On the next page, select the latest profile type and upload the CSR file. The certificate will then be generated, and you can download it to your computer. Now you need to import this into your keychain:

```sh
security import ~/Downloads/developerID_application.cer
```

Once this is done, you will be able to check whether the certificate was properly imported:

```sh
security find-identity
```

It should show the new "Developer ID Application" identity. If you selected the latest profile type, it is possible that the Mac has an outdated Apple intermediate certificate, in which case the identity may show as invalid until the intermediate certificate is updated.
Note that if you have the Xcode GUI application, you can set up your certificates directly there; this involves going to the Xcode account preferences and managing the signing certificates from there.

Only a maximum of 5 Developer ID certificates can be created. Revoking any of them requires a support email to Apple: go to https://developer.apple.com/contact/topic/select (it requires you to be signed in as an Apple developer).

Additionally, you can use these command line commands to manipulate the keypairs:

```sh
# Extract private key as encrypted PEM from p12 (it will now have a password)
openssl pkcs12 \
  -nocerts \
  -in './MATRIX AI PTY. LTD. - Developer ID Application.p12' \
  -out './MATRIX AI PTY. LTD. - Developer ID Application'

# Extract private key as unencrypted PEM from p12
openssl pkcs12 \
  -nocerts \
  -nodes \
  -in './MATRIX AI PTY. LTD. - Developer ID Application.p12' \
  -out './MATRIX AI PTY. LTD. - Developer ID Application'

# Generate public key from private key
openssl rsa \
  -pubout \
  -in './MATRIX AI PTY. LTD. - Developer ID Application' \
  -out './MATRIX AI PTY. LTD. - Developer ID Application.pub'
```

Setting up the Application-Specific Password

To notarize the executable unattended, an application-specific password for the Apple ID is needed; it can be generated at https://appleid.apple.com.
Every time Xcode is installed or updated, do this:

```sh
sudo xcode-select --reset
```

Otherwise the Xcode command line tools might be out of date.

Automating Code Signing and Notarization in the CI/CD

The signing and notarization must eventually run unattended on the macOS runner. First we must export our keypair and certificate as a `.p12` file. This is TBD; there are relevant resources to continue down this path.
Basically we need to load our Developer ID key into the macos runner, and then perform the above tasks unattended to be able to do code signing and notarisation automatically. Actually, it appears you don't need the main keychain password to be able to do this. Let's see.
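A sketch of what the unattended flow could look like (keychain name, file names, identity string and environment variables are all assumptions; `notarytool` ships with Xcode 13+):

```sh
# Create a throwaway keychain and import the Developer ID key/cert into it
security create-keychain -p "$KEYCHAIN_PASSWORD" signing.keychain
security unlock-keychain -p "$KEYCHAIN_PASSWORD" signing.keychain
security import developer_id.p12 -k signing.keychain \
  -P "$P12_PASSWORD" -T /usr/bin/codesign

# Sign the executable
codesign --sign "Developer ID Application: MATRIX AI PTY. LTD." \
  --timestamp --options runtime ./typescript-demo-lib

# Notarize it (zip first, since notarytool takes an archive)
zip app.zip ./typescript-demo-lib
xcrun notarytool submit app.zip \
  --apple-id "$APPLE_ID" \
  --password "$APP_SPECIFIC_PASSWORD" \
  --team-id "$TEAM_ID" \
  --wait
```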
Since I've confirmed that this does not cause a full rebuild of the derivations, I'm keeping the change.
All relevant issues have been created in this repo and typescript-demo-lib-native. Remaining development will continue on MatrixAI/TypeScript-Demo-Lib-Native#2. We will be requesting access to macos for that repository too.
We will have some changes brought in here after first merging MatrixAI/TypeScript-Demo-Lib-Native#2
Description
In helping solve the snapshot isolation problem in MatrixAI/js-db#18, we needed to lift the hood and go into the C++ level of nodejs.
To do this, I need to have a demonstration of how native addons can be done in our demo lib here.
There are 2 ecosystems for building native addons: the prebuild ecosystem and the node-pre-gyp ecosystem.
Of the 2, the prebuild ecosystem is used by UTP and leveldb, so we will continue using that. Advantages (from 2016) were commented on here: prebuild/prebuild#159
The basic idea is that Node supports a "NAPI" system that enables node applications to call into C++. So it's the FFI system of NodeJS. It's also a bidirectional FFI, as C++ code can call back into NodeJS JS functions.
The core library is `node-gyp`. In the prebuild ecosystem it is wrapped with `node-gyp-build`, which you'll notice is the one that we are already using in this repo. The main feature here is the ability to supply prebuilt binaries, instead of expecting the end-user to always compile from source. Further details here: https://nodejs.github.io/node-addon-examples/build-tools/prebuild (it also compares it to node-pre-gyp).
The `node-gyp-build` has to be a `dependency`, not a `devDependency`, because it is used during runtime to automatically find the built shared-object/dynamic library and to load it. It looks like this:
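A minimal sketch of that usage:

```ts
import nodeGypBuild from 'node-gyp-build';

// Looks up ./prebuilds/<platform>-<arch>/*.node first,
// falling back to the locally compiled ./build output
const native = nodeGypBuild(__dirname);

export default native;
```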
Internally `nodeGypBuild` ends up calling the `require()` function inside NodeJS, which supports the ability to load `*.node` binaries (the shared objects compiled using the NAPI C++ headers). See: https://github.com/prebuild/node-gyp-build/blob/2e982977240368f8baed3975a0f3b048999af40e/index.js#L6
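In effect, the lookup reduces to something like this (path illustrative):

```ts
// What node-gyp-build effectively resolves to under the hood
const addon = require('./prebuilds/linux-x64/node.napi.node');
```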
The `require` is supplied by the NodeJS runtime. If you execute the JS with a different runtime, it may support the CommonJS standard and thus understand the `require` calls, but it may not be compatible with native modules compiled against the NAPI headers. This is relevant since you also have to load the binary that matches your OS libraries and CPU architecture; it's all dynamic linking under the hood. This is also why you use `node-gyp-build`, which automates some of this lookup procedure.

As a side note about bundlers: bundlers are often used as part of the build process that targets web platforms. Since the web platform does not understand `require` calls, bundlers will perform some sort of transclusion. This is also the case when ES6 `import` targets files on disk. Details on this process are here: https://github.com/evanw/esbuild/blob/master/docs/architecture.md#notes-about-linking. Bundlers will often call this "linking", and when targeting web platforms, this is basically a form of static linking, since JS running in browsers cannot load JS files from disk. This is also why in some cases one should replace native addons with WASM instead, as bundlers can support static linking of WASM (which is cross-platform) into a web bundle. But some native addons depend on OS features (like databases with persistence), and fundamentally cannot be converted into WASM binaries. In the future, it would make sense to turn our crypto code into WASM binaries; but DB code is likely to always be native, as it has to be persistent. As the web develops and gains extra features, it may eventually become possible to do all native code via WASM (but this may be a few years off).

Now the native module itself is just done with a C++ file like `index.cpp`. We should prefer using `.cpp` and `.h` as the most portable extensions.

Additionally, there must be a `binding.gyp` file that looks like this:
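A sketch of such a file, using the `napi-macros` include trick described below (the target name is a placeholder):

```
{
  "targets": [{
    "target_name": "somename",
    "sources": ["./index.cpp"],
    "include_dirs": [
      "<!(node -e \"require('napi-macros')\")"
    ]
  }]
}
```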
Basically it is another configuration file that configures `node-gyp` and how it should compile the C++ code. The `target_name` specifies the name of the addon file, so the output result will be `somename.node`. The `sources` are self-explanatory. The `include_dirs` entries have the ability to execute shell commands; in this case, `node -e` executes a script that returns a path to C++ headers to be included during compilation.

The C++ code needs to use the NAPI headers; however, there's a macro library that makes writing NAPI addons easier: https://github.com/hyperdivision/napi-macros. I've seen this used in utp-native and classic-level.
The C++ code may look like this:
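A sketch along the lines of the napi-macros examples:

```cpp
#include <node_api.h>
#include <napi-macros.h>

// times_two(n): read an int32 argument, double it, and return it
NAPI_METHOD(times_two) {
  NAPI_ARGV(1)
  NAPI_ARGV_INT32(number, 0)

  number *= 2;

  NAPI_RETURN_INT32(number)
}

NAPI_INIT() {
  NAPI_EXPORT_FUNCTION(times_two)
}
```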
This ends up exporting a native module containing the `times_two` function, which multiplies a number by 2 and returns an `int32` number.

It's also important that `node-gyp-build` is set up as an `install` script in the `package.json`:
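The relevant `package.json` fragment (this is the standard setup in the prebuild ecosystem):

```json
{
  "scripts": {
    "install": "node-gyp-build"
  }
}
```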
This means when you run `npm install` (which is used to install all the dependencies of an NPM package, or to install a specific NPM package), it will run `node-gyp-build` during the installation process.

This means that currently our `node2nixDev` expression in `utils.nix` still requires the `npm install` command. This used to exist; however, I removed it during #37 thinking it had no effect. But it was confirmed in svanderburg/node2nix#293 (comment) that the `npm install` command is still run in order to execute build scripts, and `node-gyp-build` is now part of the installation process. We should include https://github.com/svanderburg/node2nix/blob/8264147f506dd2964f7ae615dea65bd13c73c0d0/nix/node-env.nix#L380-L387 with all the necessary flags and parameters too. We may be able to make it work if we hook our build command prior to `npm install`. I imagine that this should be possible, since the `npm rebuild` command is executed prior. So we need to investigate this.

In order to make this all work, our Nix environment is going to need all the tools for source compilation. According to https://github.com/nodejs/node-gyp#on-unix we will need `python3`, `make` and `gcc`. Our `shell.nix` naturally has `make` and `gcc` because we are using `pkgs.mkShell`, which extends from `stdenv.mkDerivation`. However `python3` will be needed as well.
The `node2nix` tool has some understanding of native dependencies (this is why it also brings in `python` in its generated derivation, svanderburg/node2nix#281), and I believe it doesn't actually build from source (except for some overridden dependencies).

Some npm dependencies are brought in via nixpkgs `nodePackages` because the `node2nix` derivation isn't enough to build them (they have complex native dependencies), such as `node-gyp-build` itself or Vercel's `pkg`. This is also why I had to provide `nodePackages.node-gyp-build` in our `buildInputs` overrides in `utils.nix`. It is important that any dependencies acquired via nixpkgs are the same version we use in our `package.json`, and this is currently the case. Ideally we won't need to do this for our own native packages if `js-db` ends up forking `classic-level` or `leveldown`. I think this trick is only relevant for our "build tools" and not our runtime dependencies.

The remaining problem is cross-compilation, as this only enables building from source if you are on NixOS and/or using Nix. Windows and MacOS will require their own setup. Since our development environment is all Nix focused, we don't have to worry about those; but end-users who want to rebuild from scratch will need to set up their development environment based on the information in https://github.com/nodejs/node-gyp. A more pressing question is how we, in our Nix development environment, will be capable of producing cross-platform native addons for distribution.
This is where the prebuild ecosystem comes in, in particular https://github.com/prebuild/prebuildify-cross. This is used in leveldb to build for different platforms and save the cross-compiled objects. These objects are then hosted on GitHub releases, and automatically downloaded upon installation by downstream users; in case they are not downloadable, they are built from source. https://github.com/Level/classic-level/blob/f4cabe9e6532a876f6b6c2412a94e8c10dc5641a/package.json#L21-L26
However, in our Nix-based environment, I wonder if we can avoid using docker for cross-compilation, and instead use Nix to provide all the tooling. We'll see how this plays out eventually.
Some additional convenience commands were added as well.
Issues Fixed

- Using `nodejs.src` for `--nodedir` when it can just use the `nodejs` derivation - svanderburg/node2nix#295
- `mkShell` should set `NIX_NO_SELF_RPATH = true;` by default - NixOS/nixpkgs#173025

Tasks
1. Use `node-gyp-build` to load the native addon at runtime
2. Add example functions: `addOne` for primitives, `setProperty` for a reference-passing procedure, and `makeArray` for heap allocation
3. Update the `nix` expressions to support `node-gyp-build` and other build scripts, and see if we can eliminate our `postInstall` hook by relying on `package.json` hooks instead
4. Use `prebuildify` to precompile binaries and host them on our git release... but this depends on whether `typescript-demo-lib` is used as a library or as an application: if used as an application, then the `pkg` builds are used; if used as a library, then one must install the native binary from the same github release, which means the native binary must be part of the same release page. `pkg` integration may just be a matter of setting the `assets` path in `package.json` to the local `prebuilds` directory.
5. [ ] Cross compilation - we must use CI/CD to do cross compilation (not sure about other architectures like ARM), with `prebuildify-cross` or something else that uses Nix
6. Update the `@typescript-eslint` packages to match js-db to avoid the warning message.
8. [ ] Update README.md to indicate the 2 branches of typescript-demo-lib, the main branch and the native branch, where the native branch indicates how to build native addons - this will be done in a separate repo: https://github.com/MatrixAI/TypeScript-Demo-Lib-Native based off https://gitlab.com/MatrixAI/Employees/matrix-team/-/issues/8#note_885403611
10. The `pkg` bundle can receive optimisation on which prebuilt architectures it bundles; right now it bundles all architectures, even when the target architecture implies that only a single architecture is required. This can slim the final output of `pkg` so it's not storing random unnecessary things. This may mean that a dynamic `--config` has to be generated for `pkg`.
11. `nix-build ./release.nix -A application` can use the `prebuilds/` directory as well, as this can unify with `pkg`; that way all things use the `prebuilds/` directory. But we would want to optimise it with task 10.
12. [ ] Ensure that `npm test` can automatically run general tests, plus platform-specific tests if detected on the relevant platform - this can be done in polykey as a script

Future Tasks
- Cross-compilation for `win-arm64` and `linux-arm64` (linux will require the necessary nix-shell environment) - Cross-Compilation for linux-arm64 and win-arm64 TypeScript-Demo-Lib-Native#3
- Code signing with `ldid` or `codesign` - #38 (comment)
- Fix the `pkg` bundling script so that it doesn't bundle useless `.md` files; right now it's even bundling the `CHANGELOG.md` files - #38 (comment)
- Use `pkg` executables instead of `zip` archives so you can do stapling, and therefore not require the client systems to have access to the internet before running the executable - #38 (comment) - Automate MacOS Code-Signing in CICD Polykey-CLI#253
- Automate code signing in the `integration:macos` job - #38 (comment) - Automate MacOS Code-Signing in CICD Polykey-CLI#253
- Platform-specific grouping of tests for `npm test` (it should automatically understand how to conditionally test these things by loading files appropriately on the right platform, or just a script that knows): https://stackoverflow.com/questions/50171932/run-jest-test-suites-in-groups - #41