Switch default python to 3.10 #1219
Conversation
3.7 is *quite* old, and folks get caught up in it because it also sometimes forces much older versions of packages - and it can be quite confusing to debug (see 2i2c-org/infrastructure#1934, for example).
How do we handle communicating this change to the Jupyter and mybinder.org community? This may break some repos that don't pin a Python version if they rely on a package version that's not built for 3.10. Long term, do you have any thoughts on how we can make this process better? We're going to run into the same problem of breaking repos with future bumps, or when we bump the base image from Ubuntu 18.04 to a newer version.
There is an issue (somewhere :-/) about the fact that for real reproducibility people need the triple: repo link, revision of the code, and revision of repo2docker. We discussed adding a feature to repo2docker that would allow it to fetch a version of itself and use that instead of what the user installed when building the container (something like fetching the container image for that revision of r2d).

In the past we have broken the "reproducibility promise" a few times already, for example when we switched to JupyterLab as the default, and in some other rare cases. In general, because the universe keeps evolving, I think it is already likely that (very) old revisions of a repo will not build, or will lead to a container that is quite different. Despite this, I think we should try hard not to add to this source of entropy by going wild with breaking changes in r2d.
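Today, the closest approximation of that triple is pinning repo2docker manually alongside the repo and revision. A rough command sketch (the version number and repository URL are illustrative, not from this thread):

```shell
# Sketch only: reproduce a build by pinning all three parts of
# the triple - repo2docker version, repo URL, and code revision.
# The version and URL below are placeholders.
pip install jupyter-repo2docker==2022.10.0
repo2docker --ref <commit-sha> https://github.com/example/some-repo
```

The feature discussed above would automate this, so users wouldn't need to install the matching repo2docker release themselves.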
I've created a new issue for the big picture of long-term support: #1220
I left a more detailed comment in #1220, but I think it is appropriate to bump the default version at least when it is old enough that it's starting to cause failures. We definitely do need to keep updating the default version. I'm not sure what the best strategy would be between:
or some other strategy.
I'm not sure which of the proposed options to choose/vote for :-/ While thinking about it I ended up thinking that we should try to make changing the default Python version a semi-regular thing.

My reasoning: we can either make it extremely rare, like every five years, or make it fairly regular. The weird/danger zone is the middle ground. If a change like this is extremely rare, it is "by construction" a big-bang event. This requires docs, patience, giving people enough prior warning, etc. However, because you "never" do it, that is OK. If it happens quite regularly, then people are used to it, expect it, have a good "institutional memory", and maybe even some tooling to help. In the middle ground we seem to end up with the worst of both extremes: it is a big deal that requires prior warning, patience, etc. because people have forgotten that this is a thing, yet it happens frequently enough that it is annoying to spend so much effort on it.

The hard bit is, of course, figuring out what "quite regularly" means in terms of human time passing. Riding on the Python release schedule seems like a good thing to do: those releases are pretty regular, well known, expected, etc. Maybe once a year is a good rhythm? Maybe being "slow", as in sticking with the "oldest reasonable version", is good for people in the reproducibility business? Though maybe, assuming that more repos receive r2d support now than in the past, the time until these repos hit the "oops, we didn't specify a Python version but we should have" moment is far in the future, which means the original authors will have left/moved on/forgotten about that repo? Maybe I'm thinking too much without actual data :-/
Given Python's rapid annual release cadence, I think it's perhaps most logical for us to match it (maybe lagging by exactly one version, e.g. when Python releases 3.12, we bump the default to 3.11, and do this every fall?). I think it's also appropriate to put pressure on repos, via warnings and docs, to specify their Python version. Many repos that just need numpy, pandas, etc. needn't specify a Python version (in that case, they needn't specify _any_ versions at all). But if you specify any version of anything, specifying the Python version is a good idea, too.
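For a repo that does pin package versions, pinning Python belongs in the same file. A minimal `environment.yml` sketch (the package names and version numbers are illustrative, not prescriptive):

```yaml
# Illustrative environment.yml: if you pin any package versions,
# pin the Python version too, so a future bump of the repo2docker
# default Python doesn't silently change the solve.
name: example
dependencies:
  - python=3.10
  - numpy=1.23
  - pandas=1.5
```

A repo with no pins at all can safely omit this file entirely and float with the defaults.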
I've posted on Discourse to solicit additional feedback:
It's been two weeks and no one has responded on the Discourse post, so I think we should go ahead with this.
Merged from main and re-ran the freeze.
There's a `py310-requirements-file` test in
https://github.com/jupyterhub/repo2docker/tree/738a56dcd5b5169b14ed5b535514cf4cf3fb8e63/tests/conda/
that's intended to test the newest supported version of Python. With this PR it's redundant, but it should be updated when Python 3.11 is added (#1239).
When we merge this to mybinder.org I think we should post an announcement on Discourse (or we could even consider announcing it in advance?).
What do we do about the failing external test? Update the pins?
Good spot, I thought it was a transient error but it looks real 😞
The repository doesn't specify the required version of Python. However, you could argue that since it pins some of the packages, the repository wants "any version of Python that enables the requested package versions to be installed". This wouldn't work for pip, but it works for conda because python is just another package whose dependencies can be taken into account. If we want to support this we probably need
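A sketch of that conda behavior (package names and versions are illustrative): an environment with only package pins and no explicit python entry leaves the solver free to choose whichever Python satisfies those pins, whereas pip can only install into whatever Python interpreter already exists:

```yaml
# No explicit python entry: conda/mamba treats python as just
# another package, so these pins implicitly constrain which
# Python versions the solver may choose (packages are only
# built for certain Python versions).
dependencies:
  - numpy=1.21.*
  - scipy=1.7.*
```

With pip, the same pins would either install or fail against the pre-selected interpreter; the Python version is never part of the solve.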
Looking at the comment for the xeus-cling test (added in #373), it appears to be meant to verify that downgrades are allowed, but the failure suggests that the downgrade doesn't work. I think downgrading most packages is allowed, but Python itself is not allowed to be downgraded (by minor version number, at least). I'm not sure where this is enforced, whether it's something we do or something in conda/mamba.

In any case, I believe the failing test is either correct (implicit downgrade of Python should work and doesn't) or no longer tests what we want it to (implicit downgrade of other packages while keeping the chosen Python, even if it wasn't chosen by the user, intact). So either we should figure out how to unpin Python (I don't think this has ever worked), or find another repo which triggers a downgrade of something other than Python. I think this will work if we copy the xeus-cling environment and add

I also just noticed while testing that this PR didn't actually change the default Python to 3.10 in some important cases! There are two ways to get at a 'default' python env - load the
This default lives in too many places!
Rather than building old xeus-cling, which requires a downgrade of Python from 3.10 to 3.9 (which is _not_ supported), run the build with a 3.9 pin. This still results in a patch-level downgrade of Python, a major downgrade of openssl, etc.
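A hedged sketch of what such a pinned test environment could look like (the channel and package list here are assumptions for illustration, not copied from the PR):

```yaml
# Sketch: pin python=3.9 explicitly so the build still exercises
# downgrades of other packages (openssl, patch-level python)
# without requiring an unsupported minor-version Python downgrade.
name: xeus-cling-test
channels:
  - conda-forge
dependencies:
  - python=3.9
  - xeus-cling
```

The key point is that the Python pin is explicit, so the solver never needs to cross a minor-version boundary on its own.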
All green now. I've elected to bring the xeus-cling test into our repo with an added

I also updated the

The codecov errors are a flakiness issue in codecov itself, and not real. All tests are passing.
This has approvals and I think it's ready to go, but since I've made noticeable changes since the approvals, I'd like at least one other person to give it a green light before merging. |
Re-synced after #1239. This now contains no re-freeze, only the change in the default.