ci: Build linux aarch64 wheels on release #315
base: main
Conversation
Signed-off-by: odidev <odidev@puresoftware.com>
Thanks for exploring this, Jeremy! 😄 Generally supportive of helping more users on more architectures. Though it is worth noting we are struggling with CI burden ( #312 ). Would suggest that we start trimming what runs on every PR in favor of running some things only on
cc @joshmoore
.github/workflows/wheel.yaml (outdated):

    - name: Install cibuildwheel
      run: |
        python -m pip install cibuildwheel==1.8.0 wheel
Do we need this pinning or could this be dropped?
Note: We are using the cibuildwheel action above. Maybe there is some way to consolidate this work into that step (possibly through additional configuration options)?
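For illustration, a minimal sketch of what consolidating into the cibuildwheel action could look like; the action version and the CIBW_* values below are assumptions for illustration, not the project's actual settings.

```yaml
# Hypothetical step: let the pypa/cibuildwheel action install and run
# cibuildwheel itself, instead of pinning cibuildwheel==1.8.0 via pip.
- name: Build wheels
  uses: pypa/cibuildwheel@v2.16   # action version chosen for illustration
  env:
    CIBW_BUILD: "cp38-* cp39-* cp310-* cp311-*"   # assumed Python selection
    CIBW_ARCHS_LINUX: aarch64
```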
It would be great to get these wheels out. @jthetzel and @jakirkham - in your opinion, what's needed to move forward?
As noted above ( #315 (comment) ), I am concerned about CI usage. Currently we run wheel builds on every PR regardless of what the PR changes. These builds take quite a while to run. We've cut down the number of builds done in wheel jobs to help mitigate this. However, adding new architectures that require emulation (like ARM) will make these wheel jobs take longer. So I think we need to come up with a way to only selectively enable these jobs. Please see issue ( #312 ) for more details. Separate from that, CI here is failing. I don't have time to look into this unfortunately, so I cannot advise on that at the moment.
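One possible way to "selectively enable" heavy wheel jobs, offered here only as a sketch under assumptions (this PR does not implement it), is to gate the job on a manual dispatch or on a made-up "build-wheels" PR label:

```yaml
# Hypothetical gating: run the wheel job only on manual dispatch or when the
# PR carries a (made-up) "build-wheels" label.
on:
  pull_request:
    types: [opened, synchronize, labeled]
  workflow_dispatch:

jobs:
  build_wheels:
    if: >-
      github.event_name == 'workflow_dispatch' ||
      contains(github.event.pull_request.labels.*.name, 'build-wheels')
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pypa/cibuildwheel@v2.16
```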
I have the same issue, and while I can use conda for the install to use the code, it prevents me from testing with
I don't think we are against building the wheels. It's just that, as this PR is configured currently, it builds them on every PR, which adds up to a lot of CI usage and negatively impacts the contributor experience. If we could do this only when there is a release and/or only when there are changes to build-related files (
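As an illustration of that idea, a hypothetical trigger block; the specific paths listed as "build-related" are assumptions, not the project's actual list:

```yaml
# Hypothetical triggers: build wheels for release tags, and for PRs only when
# build-related files change.
on:
  push:
    tags:
      - "v*"
  pull_request:
    paths:
      - ".github/workflows/wheel.yaml"
      - "setup.py"
      - "pyproject.toml"
```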
The continuous integration is updated to only build aarch64 wheels when a release is created (when a tag matching "v*" is created).
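A rough sketch of what such a release-only aarch64 workflow could look like; the workflow, job, and step names are invented, and only the "v*" tag trigger is taken from the comment above. QEMU emulation of aarch64 on x86_64 runners is what makes these builds slow:

```yaml
name: aarch64 wheels              # hypothetical workflow name
on:
  push:
    tags:
      - "v*"                      # build only when a release tag is pushed
jobs:
  build_aarch64_wheels:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Set up QEMU         # emulate aarch64 on the x86_64 runner
        uses: docker/setup-qemu-action@v3
        with:
          platforms: arm64
      - name: Build wheels
        uses: pypa/cibuildwheel@v2.16
        env:
          CIBW_ARCHS_LINUX: aarch64
```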
Building aarch64 wheels for Python 3.8, 3.9, 3.10, and 3.11 takes about 1.5 hours ( https://github.com/jthetzel/numcodecs/actions/runs/5500370791 ).
@jakirkham @rabernat I completely forgot about this pull request until I went to build a container on a Jetson Nano with a zarr dependency this morning, googled my way here, and was shocked that I contributed to it over a year ago (my daughter, Daphne, was born five days later). Having a separate aarch64 wheel build action that only triggers when a release is deployed feels clean to me. Until aarch64 build VMs are widely available, I'm not aware of other options, but I'm interested in further discussion.
@joshmoore Bumping this just to check whether more work is needed or a different path is preferred.
Thanks for the ping, @jthetzel. I've relaunched the builds. I'm on the road though, so I will be a bit slow on the uptake until next week. cc: @jakirkham. Did you happen to see #428 for a related / alternative proposal?
Can this be merged? It'd be nice to have
Instead of adding a whole new GitHub Actions workflow, I think it's easier to add support for Linux aarch64 to the existing wheel workflow: https://github.com/zarr-developers/numcodecs/blob/main/.github/workflows/wheel.yaml
@jthetzel: are you interested in giving that a try?
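A sketch of what extending the existing wheel.yaml could look like; step placement, action versions, and the architecture list are assumptions rather than the workflow's actual contents:

```yaml
# Hypothetical additions to the existing build steps: a QEMU step on Linux
# runners, and a wider set of Linux architectures for cibuildwheel.
- name: Set up QEMU
  if: runner.os == 'Linux'
  uses: docker/setup-qemu-action@v3
  with:
    platforms: arm64

- name: Build wheels
  uses: pypa/cibuildwheel@v2.16
  env:
    CIBW_ARCHS_LINUX: "x86_64 aarch64"   # add aarch64 alongside native builds
```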
Certainly. I'll update the branch and re-read the discussion.
The main reason we want to split them out is that CI time blows up if we have them all in one job. Perhaps another approach is that we only build these on
Do you mean split them out into multiple GitHub Actions jobs that run in parallel? If so, the current approach of having a matrix across operating systems achieves this in a single configuration file, which produces parallel jobs for each OS (e.g. see https://github.com/zarr-developers/numcodecs/actions/runs/10338358292). So I think, to minimise duplicating code/GH Actions config, the way to go is to update the existing configuration file instead of adding a new one?
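For illustration, a single matrix like the following fans out into parallel jobs per OS/architecture combination; the concrete entries and versions are assumptions, not numcodecs' actual matrix:

```yaml
jobs:
  build_wheels:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: ubuntu-latest
            arch: x86_64
          - os: ubuntu-latest
            arch: aarch64        # emulated via QEMU, so slower than the rest
          - os: macos-latest
            arch: arm64
          - os: windows-latest
            arch: AMD64
    steps:
      - uses: actions/checkout@v4
      - name: Set up QEMU
        if: matrix.arch == 'aarch64'
        uses: docker/setup-qemu-action@v3
        with:
          platforms: arm64
      - uses: pypa/cibuildwheel@v2.16
        env:
          CIBW_ARCHS: ${{ matrix.arch }}
```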
Testing merge conflicts from #289.