Join as maintainer to take care of migrations #839

Merged: 8 commits into conda-forge:main from join, Sep 9, 2022

Conversation

@h-vetinari (Member) commented Sep 8, 2022

Hey all

There are currently a few tricky migrations pending (abseil, grpc, libprotobuf), and since arrow plays such a central role in the ecosystem nowadays, it's good to keep these moving along across all ABI branches. Not least because I also help co-maintain those other feedstocks, I'd be happy to take care of these migrations here, and that would be easier as a maintainer (I can push into bot PRs, and trivial PRs don't need to wait for attention).

I thought I'd propose joining first (considering the quantity of open migration PRs), but if desired I can also try raising independent PRs first.

PS. I'm not completely new to this feedstock, cf. #247 & the collaboration on conda-forge/pyarrow-feedstock#101. :)

PPS. A standard rerender seems to bring in the (now-closed) orc migration; if desired I can undo this part.

Closes #826

@conda-forge-linter (Contributor):

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe) and found it was in an excellent condition.

@xhochy (Member) commented Sep 8, 2022

@h-vetinari Can you please also apply 6643cec here to limit parallelism on Azure? Otherwise, we will block the whole of conda-forge.

@h-vetinari (Member, Author):

> @h-vetinari Can you please also apply 6643cec here to limit parallelism on Azure? Otherwise, we will block the whole of conda-forge.

Fine in principle, but that's gonna make it very painful to move through all the migrations (though as soon as we do the abseil one, we'll be dropping half the jobs). I've worked concurrently on big CI matrices in conda-forge before (e.g. the BLAS-testing PRs in numpy/scipy) and never noticed that other feedstocks didn't start, so I'm not sure where the limit lies, but it seems to be pretty high.

Do you know if the max_parallel: is per platform (in which case I would say it's barely doable), or for the whole pipeline (in which case I think the limit is not reasonable)?

PS. Windows builds here need conda-forge/admin-requests#482.

@h-vetinari (Member, Author):

> Do you know if the max_parallel: is per platform

Looks like it's for the whole pipeline: https://github.com/conda-forge/conda-smithy/blob/v3.21.1/conda_smithy/configure_feedstock.py#L1343-L1345

Currently there are 70 jobs, and the aarch/ppc ones take 5-6h based on the last successful run. With a limit of 8, that would mean waiting upwards of 48h for a single CI run to finish (roughly 70/8 ≈ 9 waves of jobs at up to ~6h each), which is not reasonable IMO (considering the number of branches & migrations outstanding). I think 30 parallel jobs would be manageable, both for waiting time & the rest of conda-forge.

@xhochy (Member) commented Sep 8, 2022

> Do you know if the max_parallel: is per platform
>
> Looks like it's for the whole pipeline: https://github.com/conda-forge/conda-smithy/blob/v3.21.1/conda_smithy/configure_feedstock.py#L1343-L1345
>
> Currently there are 70 jobs, and the aarch/ppc ones take 5-6h based on the last successful run. With a limit of 8, that would mean waiting upwards of 48h for a single CI run to finish, which is not reasonable IMO (considering the number of branches & migrations outstanding). I think 30 parallel jobs would be manageable, both for waiting time & the rest of conda-forge.

Then set it to 30 jobs for now. We should limit the parallelism on this feedstock though, as very briefly summarised in #825.

@h-vetinari (Member, Author):

> Then set it to 30 jobs for now. We should limit the parallelism on this feedstock though, as very briefly summarised in #825.

OK, that makes sense. I'll be mindful not to open/merge too many PRs concurrently, and I'll lower the limit on the abi-migration branches once the migrations are through, so that when the bot comes with 3-4 PRs at once for a migration, we're not starting 100+ jobs at the same time.
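
For reference, here is roughly what I have in mind for conda-forge.yml; take the exact key placement as my reading of the conda-smithy config (cf. the configure_feedstock.py lines linked above) rather than a verified diff:

```yaml
# conda-forge.yml (sketch) -- assumes conda-smithy picks up azure.max_parallel
# as the cap on concurrent Azure jobs rendered for this feedstock
azure:
  # generous while we push the ABI migrations through all branches;
  # to be lowered again afterwards so bot PR waves don't flood the shared pool
  max_parallel: 30
```

Lowering it again later is then a one-line change on each branch.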

@h-vetinari (Member, Author):

@xhochy, regarding the clang version used here, I needed to bump away from 10 at least on Windows (VS2019 needs clang>=11), and decided to give it a shot with 14. It seems to be working without issue, but does this need to be checked somehow, or should I restrict this bump to Windows?

I saw in the build scripts that there were some comments along the lines of "We need llvm 11+ support in Arrow for this" (for osx-arm), but since that commit is almost 2 years old, I guess it might be time to revisit this?

Ideally, the osx-arm parts would be changed in a different PR. It's gonna be another ~10h until the aarch/ppc builds are through, and merging this one would allow me to tackle the other branches in the meantime.
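
For illustration, if we did want to keep the bump Windows-only, it would be a selector-based entry along these lines; note that llvm_version is my placeholder for whatever the recipe's conda_build_config.yaml actually calls its clang/llvm pin, so treat this as a hypothetical sketch, not a patch:

```yaml
# recipe/conda_build_config.yaml (hypothetical sketch; the key name llvm_version
# is an assumption -- check the recipe's actual variant key)
llvm_version:
  # keep the previous clang on unix
  - "10"  # [not win]
  # VS2019 needs clang>=11, so bump only on Windows
  - "14"  # [win]
```

That said, since 14 seems to be building cleanly everywhere, a single across-the-board entry would keep the variant matrix simpler.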

@h-vetinari (Member, Author):

All the failures / timeouts here are flaky and can be fixed by restarts after merging (if they recur). IMO this is good to go (pending the clang 10->14 question).

@xhochy merged commit ec02818 into conda-forge:main on Sep 9, 2022
@xhochy (Member) commented Sep 9, 2022

> I saw in the build scripts that there were some comments along the lines of "We need llvm 11+ support in Arrow for this" (for osx-arm), but since that commit is almost 2 years old, I guess it might be time to revisit this?

Yes 😂

@h-vetinari deleted the join branch on September 9, 2022 at 07:12
@h-vetinari (Member, Author):

> > I saw in the build scripts that there were some comments along the lines of "We need llvm 11+ support in Arrow for this" (for osx-arm), but since that commit is almost 2 years old, I guess it might be time to revisit this?
>
> Yes 😂

Let's tackle this after the migrations. 🙃
