
ci: Build linux aarch64 wheels on release #315

Open · wants to merge 7 commits into main
Conversation

@jthetzel

Testing merge conflicts from #289.

@jakirkham
Member

jakirkham commented Apr 12, 2022

Thanks for exploring this Jeremy! 😄

Generally supportive of helping more users on more architectures, though it is worth noting we are struggling with CI burden (#312). I would suggest we start trimming what runs on every PR in favor of running some things only on main, or even only on release.

@jakirkham
Member

cc @joshmoore

@jthetzel
Author

The "Wheels / Build wheel on auto for macos-latest (pull_request)" check is failing with:

    ERROR: No matching distribution found for cibuildwheel==1.8.0

at this step:

    - name: Install cibuildwheel
      run: |
        python -m pip install cibuildwheel==1.8.0 wheel

@jakirkham
Member

jakirkham commented Apr 12, 2022

Do we need this pinning or could this be dropped?

Note: We are using the cibuildwheel action above. Maybe there is some way to consolidate this work into that step (possibly through additional configuration options)?
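
(For illustration, a sketch of what that consolidation might look like using the pypa/cibuildwheel action, which installs and runs its own cibuildwheel so the manual pip pin goes away; the action version and build selection below are hypothetical, not the config in this PR:)

```yaml
# Sketch: replace the manual "pip install cibuildwheel==1.8.0" step with the
# official action, which manages its own cibuildwheel install.
- name: Build wheels
  uses: pypa/cibuildwheel@v2.16  # illustrative version pin
  env:
    CIBW_BUILD: "cp39-* cp310-*"  # hypothetical build selection
```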

@rabernat
Contributor

It would be great to get these wheels out. @jthetzel and @jakirkham: in your opinion, what's needed to move forward?

@jakirkham
Member

As noted above (#315 (comment)), I am concerned about CI usage. Currently we run wheel builds on every PR regardless of what the PR changes, and these builds take quite a while to run. We've cut down the number of builds done in wheel jobs to help mitigate this. However, adding new architectures that require emulation (like ARM) will make these wheel jobs take longer. So I think we need to come up with a way to enable these jobs only selectively. Please see issue #312 for more details.

Separate from that, CI here is failing. Unfortunately I don't have time to look into it, so I can't advise on that at the moment.

@keflavich

I have the same issue: while I can use conda to install and use the code, the missing wheels prevent me from testing with tox, since my tox environment is set up to use pip. I'm not sure how to read this discussion: have you decided not to build arm64 wheels because of the CI burden, or are you just going to limit the versions you build wheels for?

@jakirkham
Member

I don't think we are against building the wheels. It's just that, as this PR is currently configured, it builds them on every PR, which adds up to a lot of CI usage and negatively impacts the contributor experience. If we could do this only when there is a release and/or only when there are changes to build-related files (setup.py, pyproject.toml, etc.), that would help address this concern.
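
(For illustration, a sketch of a PR trigger filtered to build-related files; the exact file list is an assumption, not the actual workflow in this repository:)

```yaml
# Sketch: only run wheel builds on PRs that touch build-related files.
on:
  pull_request:
    paths:
      - "setup.py"
      - "pyproject.toml"
      - ".github/workflows/wheel.yaml"  # assumed workflow path
```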

@jthetzel
Author

jthetzel commented Jul 9, 2023

The continuous integration is updated to build aarch64 wheels only when a release is created (i.e., when a tag matching "v*" is pushed).
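
(A minimal sketch of that kind of tag-gated trigger, for illustration; the actual workflow in this PR may differ:)

```yaml
# Sketch: run the aarch64 wheel build only when a release tag is pushed.
on:
  push:
    tags:
      - "v*"
```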

@jthetzel
Author

jthetzel commented Jul 9, 2023

Building aarch64 wheels for Python 3.8, 3.9, 3.10, and 3.11 takes about 1.5 hours (https://github.com/jthetzel/numcodecs/actions/runs/5500370791).
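
(The long runtime is expected: GitHub-hosted runners are x86_64, so aarch64 wheels are built under QEMU emulation. A sketch of the usual setup, with illustrative action versions:)

```yaml
# Sketch: register QEMU so the x86_64 runner can execute aarch64 containers,
# then have cibuildwheel build the aarch64 wheels inside them.
- name: Set up QEMU
  uses: docker/setup-qemu-action@v3
  with:
    platforms: arm64
- name: Build aarch64 wheels
  uses: pypa/cibuildwheel@v2.16  # illustrative version pin
  env:
    CIBW_ARCHS_LINUX: aarch64
```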

@jthetzel
Author

jthetzel commented Jul 9, 2023

@jakirkham @rabernat I completely forgot about this pull request until I went to build a container on a jetson nano with a zarr dependency this morning, googled my way here, and was shocked that I contributed to it over a year ago (my daughter, Daphne, was born five days later).

Having a separate aarch64 wheel build action that only triggers when a release is deployed feels clean to me. Until aarch64 build VMs are widely available, I'm not aware of other options, but I'm interested in further discussion.

@jthetzel changed the title from "ci: Add linux aarch64 wheel build support from @Odidev" to "ci: Build linux aarch64 wheels on release" on Jul 9, 2023
@jthetzel
Author

@joshmoore Bumping this just to check whether more work is needed or a different path is preferred.

@joshmoore
Member

Thanks for the ping, @jthetzel. I've relaunched the builds. I'm on the road though and so will be a bit slow on the uptake until next week. cc: @jakirkham

Did you happen to see #428 for a related/alternative proposal?

@dcherian

dcherian commented Mar 1, 2024

Can this be merged? It'd be nice to have aarch64 wheels.

@dstansby
Contributor

Instead of adding a whole new GitHub Actions workflow, I think it's easier to add support for Linux aarch64 to the existing wheel workflow: https://github.com/zarr-developers/numcodecs/blob/main/.github/workflows/wheel.yaml

@joshmoore
Member

@jthetzel: are you interested in giving that a try?

@jthetzel
Author

Certainly. I'll update the branch and re-read the discussion.

@jakirkham
Member

The main reason we want to split them out is that CI time blows up if we have them all in one job.

Perhaps another approach is to build these only on main, or perhaps only at release time.

@dstansby
Contributor

> The main reason we want to split them out is that CI time blows up if we have them all in one job.

Do you mean split them out into multiple GitHub Actions jobs that run in parallel? If so, the current approach of having a matrix across operating systems achieves this in a single configuration file, producing parallel jobs for each OS (e.g. see https://github.com/zarr-developers/numcodecs/actions/runs/10338358292).

So I think, to minimise duplicated code/GH Actions config, the way to go is to update the existing configuration file instead of adding a new one?
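
(For illustration, a sketch of what that matrix extension might look like; job names, action versions, and the exact matrix entries are assumptions, not the actual wheel.yaml:)

```yaml
# Sketch: extend the existing OS matrix with an emulated aarch64 entry so all
# wheel builds live in one workflow and run as parallel jobs.
jobs:
  build_wheels:
    strategy:
      matrix:
        include:
          - os: ubuntu-latest
            arch: x86_64
          - os: ubuntu-latest
            arch: aarch64   # built under QEMU emulation
          - os: macos-latest
            arch: auto
          - os: windows-latest
            arch: auto
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - name: Set up QEMU
        if: matrix.arch == 'aarch64'
        uses: docker/setup-qemu-action@v3
        with:
          platforms: arm64
      - name: Build wheels
        uses: pypa/cibuildwheel@v2.16  # illustrative version pin
        env:
          CIBW_ARCHS: ${{ matrix.arch }}
```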
