
Pin dask/distributed package versions after other installs in CI image #841

Merged
1 commit merged into main on Feb 27, 2023

Conversation

karlhigley (Contributor)

The latest attempt to build the CI container is ending up with `dask 2022.1.1`, which I think must be happening because of subsequent installs. Moving the install farther down the file to see if that helps.
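
For context, a minimal sketch of the reordering this PR describes, assuming a Dockerfile roughly along these lines (the exact pinned versions and surrounding lines are assumptions, not the actual CI image):

```dockerfile
# Hypothetical excerpt of the CI image Dockerfile; versions are assumptions.
# Earlier installs (e.g. feast) can pull in an older dask as a transitive dependency.
RUN pip install astroid==2.5.6 'feast<0.20' sklearn

# Re-pin dask/distributed at the end so no later install can downgrade them.
RUN pip install dask==2022.11.1 distributed==2022.11.1
```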
@github-actions

Documentation preview

https://nvidia-merlin.github.io/Merlin/review/pr-841

@oliverholworthy (Member)

> must be happening because of subsequent installs

Looks like it's because feast 0.19, which we install later, depends on `dask<2022.02.0`:
https://github.com/feast-dev/feast/blob/v0.19.4/sdk/python/setup.py#L69
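
As a hypothetical way to confirm this locally (illustrative commands, not from the PR), `pip show` lists a package's declared requirements and the version that was actually resolved:

```sh
# What feast declares as requirements (includes its dask pin).
pip show feast | grep -i '^requires'

# Which dask version ended up installed, and which packages require it.
pip show dask | grep -iE '^(version|required-by)'
```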

@@ -21,6 +20,10 @@ RUN pip install astroid==2.5.6 'feast<0.20' sklearn
RUN echo 'import sphinx.domains' >> /usr/local/lib/python3.8/dist-packages/sphinx/__init__.py
Collaborator

The downgrade of dask happens in `feast<0.20`, which has a requirement anchoring dask at 2022.1. So yes, adding a dask reinstall at the end should do the trick.

@karlhigley added the chore (Infrastructure update) and ci labels on Feb 27, 2023
@karlhigley merged commit 943eda5 into main on Feb 27, 2023
Labels: 23.03, chore (Infrastructure update), ci
Projects: None yet
3 participants