Always use shared cache, no matter if we are using uv or not #50923
Conversation
amoghrajesh left a comment
Oh! Let's see how it goes
Force-pushed from 14beca8 to 054439b
I had to work around a race when running
Another attempt to fix pyspark issues. It turns out that we are not passing the `use_uv` flag in tests, so the cache has not actually been shared between parallel containers. We change it so that the uv cache is shared regardless of whether `use-uv` is passed.
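As a rough illustration of the change (a minimal sketch with hypothetical names; `build_docker_run_args`, the cache paths, the environment variable, and the image name are assumptions, not the actual Breeze code), the shared cache mount is built unconditionally instead of only when the `use_uv` flag is set:

```python
# Minimal sketch, not the actual Breeze implementation: the shared uv cache
# volume is mounted for every test container, whether or not use_uv is set.
def build_docker_run_args(use_uv: bool, cache_dir: str = "/tmp/uv-cache") -> list[str]:
    args = ["docker", "run", "--rm"]
    # Before (conceptually): the cache mount was added only when use_uv was True,
    # so parallel containers started without a shared cache whenever the flag
    # was not passed in tests.
    # After: always mount the shared cache directory.
    args += ["-v", f"{cache_dir}:/root/.cache/uv"]
    if use_uv:
        # Only the choice of installer still depends on the flag
        # (hypothetical environment variable, for illustration only).
        args += ["-e", "INSTALLER=uv"]
    return args + ["example-test-image"]


if __name__ == "__main__":
    # The cache volume appears in both cases.
    print(build_docker_run_args(use_uv=True))
    print(build_docker_run_args(use_uv=False))
```

Either way, all parallel containers see the same cache directory, which is what allows the installation work to be reused between them.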
Force-pushed from 054439b to c8f31b2
Rebased to try it once more
Looks like a static check failure happened now?
Looks like an issue in main
Fixed in #50935
Read the Pull Request Guidelines for more information.
In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards-incompatible changes, please leave a note in a newsfragment file named `{pr_number}.significant.rst` or `{issue_number}.significant.rst`, in airflow-core/newsfragments.