Use ci build wheel to build more wheels #76
Conversation
To get the coverage upload on Linux to work properly, some 'hacks' were needed, since the Linux build and tests run inside Docker. Those hacks are:
- temporarily renaming kernprof.py so tests only use the installed one
- replacing the file paths inside the .coverage files, so they are valid in the host environment (needed for coverage combine)
- copying the .coverage files to the wheel output directory, so they are available on the host
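An alternative to rewriting the paths by hand is coverage.py's `[paths]` remapping, which `coverage combine` applies automatically when merging data files. A sketch; the `/project` prefix standing in for the path inside the manylinux container is an assumption:

```ini
# .coveragerc -- path aliasing applied during `coverage combine`.
# The first entry is the canonical (host) path; measurements recorded
# under the other entries are rewritten to it when data is merged.
[paths]
source =
    .
    /project
```

With this in place, `.coverage` files produced inside the container can be combined on the host without manual path surgery.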
Since the sdist test is mainly there to ensure that all needed files are included in the sdist, IMHO it is enough to test it with one Python version, since the compatibility testing is done by the wheel tests.
Codecov Report
@@            Coverage Diff             @@
##           master      #76      +/- ##
==========================================
+ Coverage   44.63%   52.99%   +8.36%
==========================================
  Files           5        5
  Lines         419      417       -2
  Branches       59       59
==========================================
+ Hits          187      221      +34
+ Misses        218      176      -42
- Partials       14       20       +6
Continue to review full report at Codecov.
This looks great. It's certainly using github actions better than I currently have the ability to. I spent a long time trying to figure out how to get artifacts to work before giving up, but it looks like you've got them working here. To respond to your questions:
I don't have any problem with this.
That sounds fine. I'm fairly sure this is the case, but I want to double check: GitHub secrets will only be exposed in events triggered by pyutils members, right? For instance, in this PR, if you were adversarial and added
I don't understand exactly why this is the case. Are you currently testing in the same environment that is building the wheels? Wouldn't it be better to (1) build the wheel and upload it as an artifact, then (2) install the wheel and test it in a separate environment? Testing in the build environment seems like it might not be the best idea. I also have some other comments:
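The two-stage pattern described above could look roughly like this in a workflow; the job names, artifact name, and the `pytest --pyargs line_profiler` test invocation are illustrative assumptions, not taken from this PR:

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build wheel
        run: pip wheel . -w wheelhouse
      - uses: actions/upload-artifact@v2
        with:
          name: wheels
          path: wheelhouse/

  test:
    needs: build          # fresh environment, runs only after build succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: wheels
          path: wheelhouse/
      - name: Install and test the built wheel
        run: |
          pip install wheelhouse/*.whl
          pytest --pyargs line_profiler
```

This way the tests exercise exactly the artifact that would be shipped, rather than the build environment's in-place checkout.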
As for your questions:
Thanks for answering those questions. That gives me a better understanding of how external users can interact with this repo and how cibuildwheel is working. I'm the author of the ubelt library. It's one of my PhD babies (and I'm always eager to evangelize it). The function

I think I would prefer the trigger to be the push of the release branch. Something that the CI currently does not do, but that I would like it to do, is to automatically create and push the tag when it is deploying. I've done this before, but I think I put in the wrong secret or something. No need to worry about it in this PR.

Overall, this looks complete. Before I merge, I'm going to take some time to really dig in and try to understand everything, test building the wheels locally, and also test pushing to the PyPI test server. Thank you for all your hard work, this has been incredibly helpful! I will merge soon! 🥳
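The auto-tagging idea mentioned above could be sketched as a deploy step like the following; the version-extraction command and the committer identity are assumptions, and the default `GITHUB_TOKEN` usually has enough rights to push a tag:

```yaml
- name: Tag the release
  run: |
    VERSION=$(python setup.py --version)   # assumption: setup.py reports the version
    git config user.name "github-actions"
    git config user.email "github-actions@users.noreply.github.com"
    git tag "v${VERSION}"
    git push origin "v${VERSION}"
```

This is only a sketch; getting the credentials right (as noted above) is the part that tends to go wrong.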
Ofc wasn't saying anything against

If you want to dig into how

As for releases, I personally like the workflow of just using the releases GitHub web UI and having it create the tag for me, with the CI/CD taking care of all the rest. I like having all that done in a single interaction (fewer things for me to forget 😅). But ofc this is just personal preference.
The only thing I think I don't understand is what file

Everything else looks really good though, and I'll merge as soon as I figure out the answer to the above question.
In this case, it installs and runs cibuildwheel with the appropriate options and shell.
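For reference, invoking cibuildwheel through its composite action (the action.yml discussed above) looks roughly like this; the version pin and the `CIBW_BUILD` selection are illustrative assumptions:

```yaml
- uses: joerick/cibuildwheel@v1.10.0   # runs action.yml, which installs and invokes cibuildwheel
  with:
    output-dir: wheelhouse
  env:
    CIBW_BUILD: "cp36-* cp37-* cp38-*"   # illustrative: which CPython versions to build for
```

The action is a thin wrapper, so the same thing can also be done manually with `pip install cibuildwheel` followed by `cibuildwheel --output-dir wheelhouse`.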
I updated
Ah, I knew there must have been a file it was linking to, I just couldn't figure out what it was. I must have just kept glancing over action.yml when I was looking, but now that makes sense to me. The dependabot schedule seems reasonable. If you want to add checks for Python packages in another PR, I'll accept it. I think it's a nice-to-have, but maybe not strictly necessary. Anyway, thank you again for all this hard work. I'm going to merge this, and attempt to publish shiny new wheels. |
This PR implements the GitHub Action from joerick/cibuildwheel to build and test more wheels (the additional wheels are Windows, macOS, and aarch64).
It also unifies the workflows python-publish.yml, python-sdist-test.yml, and python-test.yml into tests.yml. Having a single workflow to run all tests allows jobs to require other jobs to pass before running, which can save CI time.
For example, in this workflow the tests only run if the linting passes, and deployment only runs if all tests pass.
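That gating is expressed with the `needs` keyword; a minimal sketch, with illustrative job names and commands rather than the exact contents of tests.yml:

```yaml
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: pip install flake8 && flake8 .

  test:
    needs: lint        # skipped entirely if lint fails, saving CI time
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: pip install -e . && pytest

  deploy:
    needs: test        # only runs once all tests have passed
    runs-on: ubuntu-latest
    steps:
      - run: echo "publish wheels here"
```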
Another restriction for deployment is that the triggering event has to be the push of a tag (see the example run on my fork's master branch and the built files).
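Restricting deployment to tag pushes can be expressed with an `if` condition on the ref; a sketch with an illustrative job body:

```yaml
deploy:
  needs: test
  # Only deploy when the triggering event is the push of a tag.
  if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
  runs-on: ubuntu-latest
  steps:
    - run: echo "upload to PyPI here"
```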
See the last CI run before reactivating the branch restriction.
Speaking of branch restrictions, how about allowing the tests to run on all push events?
As a contributor, I like to know that the CI passes before I bother someone with a PR that might fail.
With the branches that the CI runs on restricted, I only have two options.
Dropping the restrictions would make it easier for contributors to check that everything is fine before opening a PR.
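Dropping the branch restriction is a one-line change to the workflow trigger:

```yaml
# tests.yml -- run on every push (including on forks) and on pull requests,
# with no `branches:` filter
on: [push, pull_request]
```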
Since the Linux wheels are built inside a Docker image and the paths don't match up with the paths on the host, uploading the coverage is quite hacky.
But it works, and maybe someone has a better solution (see).
Building the wheels for Linux could ofc also be done as it was before, without the hacks for the coverage, but IMHO saving the mental capacity of thinking about and keeping up with changing build requirements (leaving that to a widely used and specialized project) is worth it.
Added bonus: since kernprof.py isn't tested from both source and installation anymore, the coverage went up 8% 😄

No emojis this time, sorry for that again 😞.
closes #63