
Consider not capping major versions for many of our dependencies #648

Open
choldgraf opened this issue Feb 2, 2022 · 42 comments

@choldgraf
Member

choldgraf commented Feb 2, 2022

Description

Currently, we cap major versions of most dependencies. This helps ensure that we have stable behavior when dependencies release new major versions. When a new major version is released, we manually bump our cap and verify in the PR that behavior still works as expected.

There are obvious benefits to this approach, but I think there are also significant costs, and I wonder if we can relax this constraint a bit and take a more nuanced approach.

The problem we're introducing for users

By capping major versions, we may introduce headaches for our users. These are particularly problematic when they have complex environments with many packages installed. Specifically:

We run the risk that our dependency chain will be un-resolvable due to a conflict with some other package's dependencies.

If this happens, there's nothing that our users can do, other than wait for us to make a new release. Here's an extreme example of this with click. tl;dr: Click released a new major version, one of our dependencies required that version, and it was then impossible to install Jupyter Book in a fresh environment until we made a new release.

There are a lot of blog posts arguing back and forth about this, but here's a nice recent one that our own @chrisjsewell has chimed in on as well: Should You Use Upper Bound Version Constraints? from Henry Schreiner.

Why we shouldn't just un-cap everything

There are definitely benefits to having major-version caps for the dependencies that we know will introduce breaking changes with new releases (looking at you, docutils) or those that have lots of complex interconnections with our codebase that might lead to unexpected outcomes. When we cap the versions of these dependencies, we save our users unexpected headaches.

So, I wonder if we can find a compromise and adopt a policy like the following:

Proposal: cap our own stack and unstable dependencies, and do not cap other dependencies

  • We do cap:
    • Our own EBP tools (because we know that we follow semver and can coordinate releases)
    • sphinx and docutils (because they introduce many changes, are very complex, and our stack has many connection points with them)
    • Anything else?
  • By default, we do not cap versions for other dependencies, UNLESS we determine that:
    • A dependency is known to be unstable
    • A dependency has a history of breaking changes with new versions
    • A dependency is interwoven with our stack enough that major versions will likely cause confusion or problems

For the second group, we can selectively add dependency caps if we know there's a breaking change that we can't quickly resolve, and we can make a decision whether to keep that cap if we expect subsequent releases to also introduce breaking changes.
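As a rough sketch, the proposed policy might translate into dependency specifiers like the following (the package choices and version numbers here are purely illustrative, not the project's actual pins):

```toml
# Hypothetical pyproject.toml fragment illustrating the proposed policy.
[project]
dependencies = [
    # Our own EBP stack: capped, since we coordinate releases and follow semver.
    "myst-nb>=0.13,<0.14",
    # Complex, deeply interconnected dependencies: capped.
    "sphinx>=4,<5",
    "docutils>=0.15,<0.18",
    # Targeted, historically stable usage: lower bound only.
    "click>=7.1",
    "pyyaml>=5.4",
]
```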

I think this could be a way to balance "dependability of our environment" against "constraints that cause headaches for our users". For example, in the case of click, we likely would not have used a cap, because we use it in a targeted fashion and wouldn't expect a new major version to break the ways we're using it. This would have saved us the time of manually bumping click across our repositories and cutting new releases.

EDIT: After some discussion below, I also propose that we add a CI/CD job for unstable versions to our most heavily-used repositories. These could be canary deployments that run against main/master and we can use this to know when a breaking change is coming down the pipeline.

Thoughts?

What do others think? Would this be a net-benefit for our time + the time of users? In particular I am curious what @chrisjsewell thinks since I know you care about UX stability for our users!

@chrisjsewell
Member

Firstly, just to clarify:

Here's an extreme example of this with click. tl;dr: Click released a new major version, one of our dependencies required that version, and it was then impossible to install Jupyter Book in a fresh environment until we made a new release.

This issue was due to a deficiency in pip-tools (jazzband/pip-tools#1372; as I just mentioned in jupyter-book/jupyter-book#1590 (comment)), not because it was actually impossible to install, e.g. using pip.

Using upper pins should never prevent jupyter-book from being installed when a new dependency version is released; in fact, it should be quite the opposite.

@chrisjsewell
Member

So this is where I stand:

Just looking at jupyter-book for now: it is almost an "end-user tool", whereby you could pin every dependency to an exact version and know that it will never break.
But it's not quite that, because we also want to allow "power users" to install it alongside other Python packages, such as Sphinx extensions. So hard pinning is probably out of the question (and with it you would also never get bug-fix updates).

If you remove upper pins, jupyter-book will always eventually break: for a basic user who only installs jupyter-book into a fresh Python environment, eventually there will be a breaking change in a dependency that causes jupyter-book to fail in some way.

Those users will then come to us and say "hey, jupyter-book is broken": what do we do then?
Do we say: well, that's not our problem; you need to create your own requirements file pinning all the problematic dependencies until we have time to update jupyter-book and release a new version, and you then also need to remember to unpin those dependencies when a new release of jupyter-book is actually available?

Is this really what you want to be telling every user of jupyter-book, i.e. that it's not enough to simply install jupyter-book and expect it to work?

Now, it's true that pinning major versions does not "guarantee" that a dependency update will never break jupyter-book; it is quite possible for dependencies to (inadvertently or otherwise) introduce breaking changes in minor versions.
But it at least gives you a fighting chance, while still allowing some flexibility in version compatibility.

If you have upper pins, it is indeed true that eventually there will be (primarily power) users who encounter an issue wanting to use the latest version of another Python package alongside jupyter-book.
They will indeed have to wait for us to update or relax our upper pins before they can do this.
I understand their likely frustration, but at the same time I feel this will affect a significantly smaller portion of the user base.

@chrisjsewell
Member

chrisjsewell commented Feb 2, 2022

Basically, the two extremes here are:

  1. You provide jupyter-book as completely hard/exact pinned, for every dependency (and python version); we can "reasonably" guarantee it will never break, but it probably can't be used alongside any other packages, and also only new jupyter-book versions will get dependency bug fixes, etc. This is probably what a lot of "basic/non-technical" users want (i.e. simplicity/stability).

  2. You provide jupyter-book as completely unpinned for upper bounds, for every dependency (and python version); it basically cannot be used without an accompanying "lock file" (requirements file specifying all the pins), and we would provide no guarantees of it working. It would though allow power users the flexibility to try getting it to work with any python package they wanted.

In principle, you could have these as two corresponding releases of jupyter-book, e.g. jupyter-book and jupyter-book-unpinned

@choldgraf
Member Author

choldgraf commented Feb 5, 2022

I think you raise a lot of good points; agreed, we don't want Jupyter Book to become unstable for people because our upstream dependencies introduce breaking changes in unexpected ways. As you say, it's a balance between these extremes.

Quick clarification on the dependencies our users likely bring with them

One clarification: maybe I am not understanding the way dependencies work in Jupyter Book, but I think that installing jupyter-book alongside many other dependencies is not restricted to power users. Our target user personas are most likely "data scientist types, often in research and education communities".

In my experience, this user persona does almost no environment management. They just use a single kitchen sink environment and pip/conda install things into it over time, with no thought given to versions, dependency chains, etc. The power users are the ones that use something like multiple conda environments, virtualenv, etc. Even if they did do some environment management, they still need all of the packages to execute their content installed in the same environment as Jupyter Book, right?

So I would think that our users in particular are likely to be vulnerable to "kitchen sink environments causing dependency constraint conflicts". (That said, we have had a few reports of this but not a ton, so I don't know how much of an issue it is.)

Investigation into other projects in adjacent ecosystems

I was asking around other projects to see how they handle this. It doesn't seem like many of them use upper limits for their dependencies (e.g. Dask doesn't, scikit-learn doesn't, pandas doesn't; JupyterLab uses a hybrid approach).

So what do these projects do to avoid this potential problem?

A common pattern was including a CI/CD job that explicitly tests against unstable (pre-release or development) versions of upstream dependencies, so that they know before a release is made whether it will break their test suites. Speaking to folks in those communities, it sounds like the workflow is roughly:

  • If their "unstable test" fails:
    • They patch it locally until the test passes (they generally don't block PRs unless the PR itself causes the failure).
    • If it is an upstream regression or bug, they file an issue or upstream a PR to fix it.
    • They either delay their release until the unstable test passes (so they know the next upstream release won't break them), or they temporarily pin the upstream and un-pin it when the problem is resolved.
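Such an "unstable versions" job could be sketched in GitHub Actions roughly like this (the workflow name, schedule, and choice of upstreams to override are all hypothetical):

```yaml
# Hypothetical .github/workflows/test-unstable.yml canary job.
name: test-unstable
on:
  schedule:
    - cron: "0 6 * * 1"  # weekly; a failure signals an upcoming break, not a PR problem
  workflow_dispatch:
jobs:
  unstable:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.9"
      - run: pip install .[testing]
      # Override key upstreams with pre-release / development versions.
      - run: pip install --upgrade --pre sphinx docutils
      - run: pytest
```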

Maybe this could be a nice balance between "make sure our users don't have instability" and "don't constrain our users' dependency chains too strongly". Our tech stack is a bit different, so I'm not sure how this would work with, e.g., our regression tests with HTML and such, but maybe worth a try?

Curious what you/others think about it.

@chrisjsewell
Member

In my experience, this user persona does almost no environment management. They just use a single kitchen sink environment and pip/conda install things into it over time, with no thought given to versions, dependency chains, etc.

Then you can't really use https://iscinumpy.dev/post/bound-version-constraints as a reference point, since the author specifically states in the comments:

if someone knows how to work with code, they should know how to install packages, pin dependencies, and hopefully, knows how to use a locking package manager or lock file. If they don't, you probably have more problems than just this one

i.e. a major point of the thesis, is that you put the responsibility of version management on the user

@chrisjsewell
Member

Even if they did do some environment management, they still need all of the packages to execute their content installed in the same environment as Jupyter Book, right?

Well, your kernel for code execution can be entirely separate from your execution environment: https://ipython.readthedocs.io/en/stable/install/kernel_install.html#kernels-for-different-environments.
It's really the better way to do it, to separate concerns (book building vs code execution), and you could even have different environments specialized for different notebooks.
Although I don't think this is possible via ReadTheDocs, unfortunately.
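A minimal sketch of that two-environment setup (the environment names and installed packages here are illustrative):

```shell
# Build environment: only jupyter-book and its pinned build dependencies.
python -m venv book-env
book-env/bin/pip install jupyter-book

# Execution environment: whatever the notebooks actually import.
python -m venv code-env
code-env/bin/pip install ipykernel numpy   # illustrative packages

# Register the execution environment as a named kernel
# that the book's notebooks can select.
code-env/bin/python -m ipykernel install --user --name code-env
```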

@bryanwweber

@choldgraf invited me to weigh in here from Twitter, thanks 😄

My perspective is that I am 99% against upper-level pins, and only not 100% because "never say never". I personally struggled with the pins in jupyter-book and related projects as I was developing MyST-NB-Bokeh: I couldn't install a dev version of any of the dependencies because they were pinned so tightly. I had to install everything with --no-deps to get an environment that pip could solve. So that wasn't a fantastic experience, although granted it was a bit outside the normal user experience.

I'll also reply to a couple of comments from the thread that stood out to me.

Our target user personas are most likely "data scientist types, often in research and education communities".

I suspect this target persona does a fair amount of environment management, especially if they use conda. I think this is generally taught as "best practice" when you're learning conda. That said, they can still run into conflicts between dependencies of jupyter-book and any dependencies to run their code, especially Sphinx/Jupyter extensions as @chrisjsewell noted.

Do we say: well, that's not our problem; you need to create your own requirements file pinning all the problematic dependencies until we have time to update jupyter-book and release a new version

Yes, I think this is what you should say. But instead of creating a requirements file, say "please do pip install dependency==version until we fix this. Then run pip install -U jupyter-book dependency when we release the new version" and it's all better.

Even if they did do some environment management, they still need all of the packages to execute their content installed in the same environment as Jupyter Book, right?

Well, your kernel for code execution can be entirely separate from your execution environment: https://ipython.readthedocs.io/en/stable/install/kernel_install.html#kernels-for-different-environments. It's really the better way to do it, to separate concerns (book building vs code execution), and you could even have different environments specialized for different notebooks.

So your position is that environment management / a requirements.txt file is too complicated or not desirable, but multiple execution environments for different notebooks are feasible? That doesn't seem consistent to me...

Now, it's true that pinning major versions does not "guarantee" that a dependency update will never break jupyter-book; it is quite possible for dependencies to (inadvertently or otherwise) introduce breaking changes in minor versions. But it at least gives you a fighting chance, while still allowing some flexibility in version compatibility.

If you have upper pins, it is indeed true that eventually there will be (primarily power) users who encounter an issue wanting to use the latest version of another Python package alongside jupyter-book. They will indeed have to wait for us to update or relax our upper pins before they can do this. I understand their likely frustration, but at the same time I feel this will affect a significantly smaller portion of the user base.

I doubt this is true. I think if you're at the point where you're sharing enough content that you want to put it into jupyter-book, you're already a reasonably advanced user. I also don't agree that pinning major versions gives you a fighting chance; as @choldgraf noted if any dependency starts to rely on a dependency that you don't support, no one can install JupyterBook. Even without that case, it just defers the headache to your users instead, and especially to the power users who could otherwise be evangelizing your software.

As to the proposal by @choldgraf:

Proposal: cap our own stack and unstable dependencies, and do not cap other dependencies

I think capping the EB stack in jupyter-book is a reasonable stance to take, since it really is meant more as an end user application, although still somewhat annoying. At least I can do pip install jupyter-book && pip uninstall jupyter-book && pip update any deps I need && pip install --no-deps jupyter-book or conda install --dependencies-only (I think that's a flag...).

Capping dependencies further down the stack is much harder to overcome. Now I need to fork that package, bump its dependencies, and then install from my fork if I want to update anything. Since those (MyST-NB, MyST-Parser, etc.) are (to me) meant more as libraries that folks can install, the chances of having dependency conflicts are much higher.

By the way, feel free to pin docutils with == to whatever version works. And then rip it out and replace it wholesale one day 😄

@chrisjsewell
Member

Heya,

as @choldgraf noted if any dependency starts to rely on a dependency that you don't support, no one can install JupyterBook

That's incorrect: you can't "start to rely on a dependency" without releasing a new version, and then jupyter-book could still be installed with the old version of the dependency.

@chrisjsewell
Member

I couldn't install a dev version of any of the dependencies because they were pinned so tightly.

Can you give a dev version of what you were trying to install?

@bryanwweber

Heya,

as @choldgraf noted if any dependency starts to rely on a dependency that you don't support, no one can install JupyterBook

That's incorrect: you can't "start to rely on a dependency" without releasing a new version, and then jupyter-book could still be installed with the old version of the dependency.

Only if that old version is still available... It could get yanked due to a security vulnerability or any other reason, really. But this is a side point anyways.

I couldn't install a dev version of any of the dependencies because they were pinned so tightly.

Can you give a dev version of what you were trying to install?

I don't remember the versions precisely, but it was equivalent to: jupyter-book had the dependency jupytext~=1.11.3, but I wanted to install jupytext==1.12.0-dev to test against. Something like that; I don't recall exactly which package it was. I can go look if you're interested in more specifics.

@akhmerov
Contributor

akhmerov commented Feb 6, 2022

As a casual user, I did run into too-strict pinning by various jupyter-book components as well. In my case it was nothing dramatic, more of a minor nuisance, especially since forcing pip to ignore the pinning worked just fine™.

I think another cost for the end user that isn't mentioned is the inability to get bugfixes in the dependencies if they land in the newer versions. While larger packages would backport bugfixes, in practice this doesn't happen too often in the ecosystem.

Well, your kernel for code execution can be entirely separate from your execution environment: https://ipython.readthedocs.io/en/stable/install/kernel_install.html#kernels-for-different-environments. It's really the better way to do it, to separate concerns (book building vs code execution), and you could even have different environments specialized for different notebooks.

I agree that it is cleaner to separate book building from code execution, but it adds maintenance complexity, especially to CI. In practice I expect that hardly anybody does this. Is there an EBP repository that can serve as an example of this more systematic approach?

@chrisjsewell
Member

I think another cost for the end user that isn't mentioned is the inability to get bugfixes in the dependencies if they land in the newer versions.

Pinning to major versions means you get every bug fix, from minor and patch releases

@akhmerov
Contributor

akhmerov commented Feb 6, 2022

Pinning to major versions means you get every bug fix, from minor and patch releases

Of course. I expect, however, that some bugfixes only land in major releases.

@moorepants

I just landed here and have only read the first post in this thread. When packaging the various Executable Books projects in conda-forge, these upper bounds are often problematic for the conda solver. Taking them away would give the SAT solvers more flexibility in finding solutions that are likely to still result in consistent, functioning environments for users. This would mean fewer headaches for downstream packagers, environment managers, and users.

@moorepants

One option is that you don't add upper bounds on dependencies in general, but only add them in bugfix releases when you know a dependency update breaks that past released version.

Example:

Release version 4.1.0 of your package that depends on NumPy. You know that 4.1.0 only works with numpy >=1.3.0 at present, so that is the pin you set. At some point NumPy releases 1.13 and that breaks your package's version 4.1.0. So you go back and release a 4.1.1 with an upper pin numpy >=1.3,<1.13.

So instead of trying to predict what future versions might break your package in the present, just wait until a dependency does break one of your old versions and bugfix it.

This would allow flexible package manager solves as time moves on, but also gives you the control to keep the latest bugfix releases working for as many past versions of your software as you want to maintain.

This wouldn't work for anyone installing your software with exact version pins, but it would work for anyone installing with X.* or X.X.*. If users are installing with exact version pins, then they should be (or probably are) using a lock-file-type setup so that the whole tree of dependencies is pinned. Otherwise, those users would need to watch for bugfix releases of the version of your software they are using.

@chrisjsewell
Member

At some point NumPy releases 1.13 and that breaks your package's version 4.1.0. So you go back and release a 4.1.1 with an upper pin numpy >=1.3,<1.13.

Why would the solver install 4.1.1 and numpy 1.12, and not just 4.1.0 and numpy 1.13?

@moorepants

Why would the solver install 4.1.1 and numpy 1.12, and not just 4.1.0 and numpy 1.13?

Because the user would be upgrading your software and the package manager solver tries to give the newest possible versions of all dependencies.

@chrisjsewell
Member

Why would the solver install 4.1.1 and numpy 1.12, and not just 4.1.0 and numpy 1.13?

Which is exactly what the "Backsolving is usually wrong" section of https://iscinumpy.dev/post/bound-version-constraints/ explains, i.e. once you remove upper pinning, that's it, the only fix for breaking changes is to release a new version that supports the change, or tell users to pin

@chrisjsewell
Member

Because the user would be upgrading your software and the package manager solver tries to give the newest possible versions of all dependencies.

Why is 4.1.0 + 1.13 not newer than 4.1.1 + 1.12, when considering all dependencies?

@chrisjsewell
Member

Just to clarify: I am not advocating upper pinning for every package; I see the arguments for libraries.
But I'm stressing that there is a big difference between something like markdown-it-py and jupyter-book: the specific goal of jupyter-book is to make things as easy as possible for non-technical users. If you are a developer using jupyter-book, IMHO you shouldn't be (I don't); you should be using the "lower-level" myst-nb etc. jupyter-book really does nothing but collate them into an opinionated application (and part of that opinion is the pinning).
And on that note, another point I want to make is that the best way to avoid dependency issues is not to have any 😄
With myst-nb, I will soon remove nbconvert, jupyter-sphinx and ipywidgets; I've already removed nbdime from jupyter-cache, and am working on removing attrs from markdown-it-py.

@moorepants

moorepants commented Feb 12, 2022

Why is 4.1.0 + 1.13 not newer than 4.1.1 + 1.12, when considering all dependencies?

It isn't, but we don't consider all dependencies in Python (at least not for libraries, and often not for apps either). My suggestion assumes the user is trying to keep some small set of packages at the top of their stack up to date and functioning, either one package or a collection of packages. If I'm using your package in my environment, I have n packages I directly depend on. I only care that the API from those n packages works for what I'm doing (the user or library viewpoint). Everything my n packages depend on can be any version, as long as the API for the n packages is fixed and the dependency stack fits the constraints of all packages involved.

There is the rare situation where I need API from your 4.1.* and API from numpy 1.18.*; I'd be out of luck at that point and have to modify my code to work with a newer version of your package or older versions of numpy. No way around that.

With npm you could have numpy 1.18 and numpy 1.12 installed in the same dependency tree, but you can't with pip or conda, which try to provide the latest versions of all packages in the entire tree given a desired state of the subset of packages you specify in your requirements.txt, setup.py, pyproject.toml, environment.yml, etc.

I'm suggesting something similar to metadata patches, which is also mentioned in the blog post:

There is actually a solution for this outside of PyPI/pip packages; you can support metadata patches, allowing a release to be modified afterwards to add new compatibility constraints when they are discovered. Conda supports this natively; you can also approximate this with post releases and yanking in PyPI, but it’s a pain.

We can't patch the metadata, but we can release a bugfix with the upper bounds. But yes, things could still break for a user, because any dependency in the stack could have an incompatibility that conflicts with the new upper bound you add. But that is often rare and, honestly, people just have to upgrade their software to versions that are compatible at the present time. It's very hard to maintain compatibility for old software stacks without exact pins (and sometimes impossible; for example, try installing Plone 2 from the early 2000s with buildout, which does do exact pins).
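The resolver behaviour debated above can be sketched with a toy model (the versions and constraints come from the hypothetical numpy example earlier in the thread, not from any real resolver implementation):

```python
# Toy model of pip/conda-style resolution: maximize the version of the
# package the user asked for first, then pick the newest dependency
# version that its constraints allow. Shows why a backfilled cap in
# 4.1.1 "wins" over the combination 4.1.0 + numpy 1.13.

candidates = {
    # package version -> numpy versions its constraints allow
    (4, 1, 0): [(1, 12), (1, 13)],  # shipped with only numpy>=1.3
    (4, 1, 1): [(1, 12)],           # backfilled cap: numpy>=1.3,<1.13
}

pkg = max(candidates)          # newest version of the requested package
numpy = max(candidates[pkg])   # newest numpy allowed by that version
print(pkg, numpy)              # -> (4, 1, 1) (1, 12), not (4, 1, 0) (1, 13)
```

The point of contention is the `max(candidates)` step: the solver optimizes the directly requested package, not the "newness" of the whole dependency tree.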

@moorepants

moorepants commented Feb 12, 2022

But I'm stressing, there is a big difference between something like markdown-it-py and jupyter-book

There may be a big difference if you think of each of them as the only package you are trying to get working in a given "top of the stack" environment (environment, for me, = a collection of consistent packages). A user may be using jupyter book to explain concepts that require a large number of software packages, so they also need to have all that software installed along with jupyter book in the same environment because jupyter book executes code cells in the book. So adding hard constraints on jupyter book's dependencies means it could then be impossible to set up the environment you need for your book.

@moorepants

so they also need to have all that software installed along with jupyter book in the same environment because jupyter book executes code cells in the book

Jupyter book could have fixed pins if you execute the jupyter cells in a different environment. That should be possible through selecting the right kernel (right?). If that can be done, then you can have two environments: 1) an environment with only jupyter book and 2) an environment with all the software you need to execute cells in your book.

@chrisjsewell
Member

so they also need to have all that software installed along with jupyter book in the same environment because jupyter book

I've already explained why this is not actually the case; a core concept of jupyter is having the kernel (where all the software is installed) separate from the client environment, i.e. where jupyter-book is installed. We should make this easier for users to achieve

@moorepants

moorepants commented Feb 12, 2022

a core concept of jupyter is having the kernel (where all the software is installed) separate from the client environment

It is, but it's also a very confusing and unusual concept for most users. Setting up different virtual environments is a tough concept for average users too. Most expect conda install x and pip install x to just work and most have all their packages in the base environment.

@chrisjsewell
Member

Setting up different virtual environments is a tough concept for average users too.

Which is exactly why I said we should make it easier 😉

@chrisjsewell
Member

chrisjsewell commented Feb 18, 2022

Sorry, couldn't resist 😅, but over at https://github.com/pallets/jinja and https://github.com/pallets/markupsafe this kind of highlights my "fear"; within a few hours of a markupsafe release that breaks a version of jinja, they are now inundated with people telling them their package is broken (many because readthedocs builds are suddenly failing):

[screenshots: streams of newly opened issues on the jinja and markupsafe trackers]

Fair, they are telling people:

You are using an unsupported version of Jinja, please update to the latest version ... then use a tool like pip-tools to pin your dependencies

But, I doubt that's going to stop them getting many, many more "complaints" before this is over

@bryanwweber

Sorry, couldn't resist 😅, but over at https://github.com/pallets/jinja and https://github.com/pallets/markupsafe this kind of highlights my "fear"; within a few hours of a markupsafe release that breaks a version of jinja, they are now inundated with people telling them their package is broken

Yep, that's a fair concern; I don't think anyone disputes that this could happen. However, the EB projects are not nearly at the level of adoption that Jinja is at, and I think you're optimizing for the wrong problem at the current moment in the EB hype cycle. Besides, what's the worst case? You end up with a few dozen issues that you have to close? Even a few hundred could be handled in not too long a time, especially if it's a generic response like "Please upgrade".

@chrisjsewell
Member

Even a few hundred could be handled in not too long a time, especially if it's a generic response like "Please upgrade".

As long as one of you guys is willing to man the issue boards when this happens 😅

But it is not just the problem of closing issues with a generic response;
even if you think those that opened the issues are "in the wrong", do you really think most of them will see it that way?
I just can't see this being a particularly positive experience for users.

@bryanwweber

As long as one of you guys is willing to man the issue boards when this happens 😅

Sure, happy to help! 😃

But it is not just the problem of closing issues with a generic response;
even if you think those that opened the issues are "in the wrong", do you really think most of them will see it that way?
I just can't see this being a particularly positive experience for users.

Yeah, perhaps not ideal, but given specific instructions for how to resolve it, I don't see it as too negative. And anyway, that particular issue is theoretical at the moment for EB users, whereas the problems caused by the capped pins are real and being felt by at least three power users in this thread.

@chrisjsewell
Member

that particular issue is theoretical at the moment for EB users

Yes, but it's theoretical now largely because of the pinning; that's the point. I've seen very few issues from general users with our current setup, but it won't be theoretical once pins are removed.

Which again (simplistically) comes back to: do the wants of the few outweigh the needs of the many?
Personally, I would take three unhappy power users over hundreds or thousands of potentially unhappy general users
(although I would prefer that no one was unhappy 😅)

Yeah, perhaps not ideal, but given specific instructions for how to resolve it, I don't see it as too negative

You are obviously in the other camp.
That's completely understandable, since you are a developer.
It's going to be difficult to get a representative mix of technical and non-technical opinions in this discussion, given that it's happening on a forum that mostly technical users frequent.

@choldgraf
Member Author

choldgraf commented Feb 22, 2022

I'm also happy to help triage issues and close them down if we start heading down that road.

It sounds like there are reasonable arguments on both sides, and it's a question of which user groups we want to inconvenience, and to what degree. One thing we should do is think strategically about the different types of users and their roles in the project at large. For example, power users are smaller in number, but they usually play a larger role evangelizing for a project and are more likely to enter the maintainer pipeline by making contributions. On the other hand, the broader user community is much larger and often a "silent majority", as Chris suggests above, and we shouldn't forget about those folks.

Here's a quick thought - one way we could relieve some of this pressure is by keeping our version pinning as close as possible to latest for our dependencies. Perhaps we can facilitate this by doing two things:

  1. More quickly bumping dependencies when upstreams are released
  2. More quickly making new releases when we bump those dependencies

If we used something like GitHub's Dependabot, it would be easier to notice quickly when a new upstream release has been made, merge the bump if tests pass, and cut a patch release assuming our APIs don't break. The only potential downside I see is that a flood of PRs could create noise, but we could set it to run on a fairly infrequent schedule. This might be a relatively simple step forward that doesn't drastically change our pinning approach.
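As a concrete sketch, a minimal `.github/dependabot.yml` that checks our pip dependencies on a monthly cadence might look like the following (the interval and PR limit are illustrative choices, not settled decisions):

```yaml
# .github/dependabot.yml
version: 2
updates:
  - package-ecosystem: "pip"   # scans setup.py / pyproject.toml / requirements files
    directory: "/"             # location of the package manifests
    schedule:
      interval: "monthly"      # run infrequently to limit PR noise
    open-pull-requests-limit: 5
```

A monthly interval keeps the noise down while still surfacing new upstream releases soon enough that we could bump caps and cut a patch release before many users hit a conflict.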

@rossbar

rossbar commented Mar 3, 2022

FWIW there have been multiple times where setting up an environment with executablebooks projects has proven problematic, see e.g. executablebooks/MyST-NB#289, executablebooks/MyST-NB#333. I can't say specifically that this is due to the upper-bound pinning problem; all I can say is that anecdotally I've experienced more dependency problems within the executablebooks umbrella than other projects/ecosystems that I work with.

As another example, I recently tried to set up a development environment to make a contribution to MyST-NB, following the instructions in the contributor guide:

$ git clone https://github.com/executablebooks/MyST-NB
$ cd MyST-NB
$ git checkout master
# Create a clean virtual env
$ python -m venv mystnb-dev
$ source mystnb-dev/bin/activate
$ pip list  # Verify environment is empty
Package    Version
---------- -------
pip        22.0.3
setuptools 60.7.1
wheel      0.37.1
$ pip install -e .[code_style,testing,rtd]

The output of the last step is included in full here:

Output of pip install -e .[code_style,testing,rtd]
Obtaining file:///home/ross/repos/MyST-NB
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting pyyaml
  Downloading PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (682 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 682.2/682.2 KB 23.5 MB/s eta 0:00:00
Collecting docutils<0.18,>=0.15
  Downloading docutils-0.17.1-py2.py3-none-any.whl (575 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 575.5/575.5 KB 41.9 MB/s eta 0:00:00
Collecting jupyter-sphinx~=0.3.2
  Downloading jupyter_sphinx-0.3.2-py3-none-any.whl (20 kB)
Collecting sphinx-togglebutton~=0.3.0
  Downloading sphinx_togglebutton-0.3.0-py3-none-any.whl (7.8 kB)
Collecting ipython
  Downloading ipython-8.1.1-py3-none-any.whl (750 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 750.3/750.3 KB 43.0 MB/s eta 0:00:00
Collecting jupyter-cache~=0.4.1
  Downloading jupyter_cache-0.4.3-py3-none-any.whl (31 kB)
Collecting myst-parser~=0.15.2
  Downloading myst_parser-0.15.2-py3-none-any.whl (46 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 46.3/46.3 KB 173.2 MB/s eta 0:00:00
Collecting ipywidgets<8,>=7.0.0
  Downloading ipywidgets-7.6.5-py2.py3-none-any.whl (121 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.8/121.8 KB 214.8 MB/s eta 0:00:00
Collecting sphinx<5,>=3.1
  Downloading Sphinx-4.4.0-py3-none-any.whl (3.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 40.8 MB/s eta 0:00:00
Collecting importlib-metadata
  Downloading importlib_metadata-4.11.2-py3-none-any.whl (17 kB)
Collecting nbconvert<7,>=5.6
  Downloading nbconvert-6.4.2-py3-none-any.whl (558 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 558.8/558.8 KB 55.8 MB/s eta 0:00:00
Collecting nbformat~=5.0
  Downloading nbformat-5.1.3-py3-none-any.whl (178 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 178.9/178.9 KB 265.5 MB/s eta 0:00:00
Collecting ipykernel~=5.5
  Downloading ipykernel-5.5.6-py3-none-any.whl (121 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.0/121.0 KB 214.3 MB/s eta 0:00:00
Collecting matplotlib
  Downloading matplotlib-3.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.9/11.9 MB 52.0 MB/s eta 0:00:00
Collecting jupytext~=1.11.2
  Downloading jupytext-1.11.5-py3-none-any.whl (292 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 292.9/292.9 KB 86.0 MB/s eta 0:00:00
Collecting bokeh
  Downloading bokeh-2.4.2-py3-none-any.whl (18.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.5/18.5 MB 50.1 MB/s eta 0:00:00
Collecting plotly
  Downloading plotly-5.6.0-py2.py3-none-any.whl (27.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 27.7/27.7 MB 53.8 MB/s eta 0:00:00
Collecting alabaster
  Downloading alabaster-0.7.12-py2.py3-none-any.whl (14 kB)
Collecting pandas
  Downloading pandas-1.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.7/11.7 MB 51.5 MB/s eta 0:00:00
Collecting sympy
  Downloading sympy-1.9-py3-none-any.whl (6.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.2/6.2 MB 54.4 MB/s eta 0:00:00
Collecting coconut~=1.4.3
  Downloading coconut-1.4.3-py2.py3-none-any.whl (112 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.6/112.6 KB 211.1 MB/s eta 0:00:00
Collecting sphinx-panels~=0.4.1
  Downloading sphinx_panels-0.4.1-py3-none-any.whl (74 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 74.3/74.3 KB 285.1 MB/s eta 0:00:00
Collecting altair
  Downloading altair-4.2.0-py3-none-any.whl (812 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 812.8/812.8 KB 36.5 MB/s eta 0:00:00
Collecting numpy
  Downloading numpy-1.22.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (16.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.8/16.8 MB 52.5 MB/s eta 0:00:00
Collecting sphinx-book-theme~=0.1.0
  Downloading sphinx_book_theme-0.1.10-py3-none-any.whl (94 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.5/94.5 KB 282.0 MB/s eta 0:00:00
Collecting sphinxcontrib-bibtex
  Downloading sphinxcontrib_bibtex-2.4.1-py3-none-any.whl (38 kB)
Collecting sphinx-copybutton
  Downloading sphinx_copybutton-0.5.0-py3-none-any.whl (12 kB)
Collecting pre-commit~=2.12
  Downloading pre_commit-2.17.0-py2.py3-none-any.whl (195 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 195.7/195.7 KB 90.0 MB/s eta 0:00:00
Collecting pandas
  Downloading pandas-1.3.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.5/11.5 MB 51.8 MB/s eta 0:00:00
Collecting matplotlib
  Downloading matplotlib-3.3.4.tar.gz (37.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 37.9/37.9 MB 48.8 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
Collecting coverage<5.0
  Downloading coverage-4.5.4.tar.gz (385 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 385.2/385.2 KB 70.0 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
Collecting pytest~=5.4
  Downloading pytest-5.4.3-py3-none-any.whl (248 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 248.1/248.1 KB 41.3 MB/s eta 0:00:00
Collecting pytest-regressions
  Downloading pytest_regressions-2.3.1-py3-none-any.whl (22 kB)
Collecting pytest-cov~=2.8
  Downloading pytest_cov-2.12.1-py2.py3-none-any.whl (20 kB)
Collecting ipython
  Downloading ipython-7.32.0-py3-none-any.whl (793 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 793.9/793.9 KB 40.6 MB/s eta 0:00:00
Collecting pygments>=2.3.1
  Downloading Pygments-2.11.2-py3-none-any.whl (1.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 53.3 MB/s eta 0:00:00
Collecting cPyparsing<2.4.5.0.1.2,>=2.4.5.0.1.1
  Downloading cPyparsing-2.4.5.0.1.1.tar.gz (817 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 817.1/817.1 KB 40.8 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
Collecting prompt-toolkit>=1
  Downloading prompt_toolkit-3.0.28-py3-none-any.whl (380 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 380.2/380.2 KB 61.9 MB/s eta 0:00:00
Collecting jupyter-client
  Downloading jupyter_client-7.1.2-py3-none-any.whl (130 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 130.3/130.3 KB 235.6 MB/s eta 0:00:00
Collecting ipython-genutils
  Downloading ipython_genutils-0.2.0-py2.py3-none-any.whl (26 kB)
Collecting tornado>=4.2
  Downloading tornado-6.1.tar.gz (497 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 497.4/497.4 KB 52.1 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
Collecting traitlets>=4.1.0
  Downloading traitlets-5.1.1-py3-none-any.whl (102 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 102.0/102.0 KB 36.0 MB/s eta 0:00:00
Requirement already satisfied: setuptools>=18.5 in /home/ross/.virtualenvs/bar/lib/python3.10/site-packages (from ipython->myst-nb==0.13.2) (60.7.1)
Collecting pickleshare
  Downloading pickleshare-0.7.5-py2.py3-none-any.whl (6.9 kB)
Collecting backcall
  Downloading backcall-0.2.0-py2.py3-none-any.whl (11 kB)
Collecting matplotlib-inline
  Downloading matplotlib_inline-0.1.3-py3-none-any.whl (8.2 kB)
Collecting jedi>=0.16
  Downloading jedi-0.18.1-py2.py3-none-any.whl (1.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 50.4 MB/s eta 0:00:00
Collecting pexpect>4.3
  Downloading pexpect-4.8.0-py2.py3-none-any.whl (59 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.0/59.0 KB 273.2 MB/s eta 0:00:00
Collecting decorator
  Downloading decorator-5.1.1-py3-none-any.whl (9.1 kB)
Collecting jupyterlab-widgets>=1.0.0
  Downloading jupyterlab_widgets-1.0.2-py3-none-any.whl (243 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 243.4/243.4 KB 116.0 MB/s eta 0:00:00
Collecting widgetsnbextension~=3.5.0
  Downloading widgetsnbextension-3.5.2-py2.py3-none-any.whl (1.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 46.0 MB/s eta 0:00:00
Collecting nbclient<0.6,>=0.2
  Downloading nbclient-0.5.11-py3-none-any.whl (71 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 71.1/71.1 KB 276.4 MB/s eta 0:00:00
Collecting attrs
  Downloading attrs-21.4.0-py2.py3-none-any.whl (60 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.6/60.6 KB 274.5 MB/s eta 0:00:00
Collecting sqlalchemy<1.5,>=1.3.12
  Downloading SQLAlchemy-1.4.31-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 49.1 MB/s eta 0:00:00
Collecting nbdime
  Downloading nbdime-3.1.1-py2.py3-none-any.whl (5.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.3/5.3 MB 50.0 MB/s eta 0:00:00
Collecting mdit-py-plugins
  Downloading mdit_py_plugins-0.3.0-py3-none-any.whl (43 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.7/43.7 KB 245.6 MB/s eta 0:00:00
Collecting markdown-it-py~=1.0
  Downloading markdown_it_py-1.1.0-py3-none-any.whl (83 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 83.6/83.6 KB 46.1 MB/s eta 0:00:00
Collecting toml
  Downloading toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting cycler>=0.10
  Downloading cycler-0.11.0-py3-none-any.whl (6.4 kB)
Collecting kiwisolver>=1.0.1
  Downloading kiwisolver-1.3.2-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 51.4 MB/s eta 0:00:00
Collecting pillow>=6.2.0
  Downloading Pillow-9.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.3/4.3 MB 51.1 MB/s eta 0:00:00
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.3
  Downloading pyparsing-3.0.7-py3-none-any.whl (98 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 98.0/98.0 KB 300.4 MB/s eta 0:00:00
Collecting python-dateutil>=2.1
  Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 247.7/247.7 KB 281.6 MB/s eta 0:00:00
Collecting mdit-py-plugins
  Downloading mdit_py_plugins-0.2.8-py3-none-any.whl (41 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.0/41.0 KB 236.3 MB/s eta 0:00:00
Collecting jinja2
  Downloading Jinja2-3.0.3-py3-none-any.whl (133 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.6/133.6 KB 219.4 MB/s eta 0:00:00
Collecting entrypoints>=0.2.2
  Downloading entrypoints-0.4-py3-none-any.whl (5.3 kB)
Collecting mistune<2,>=0.8.1
  Downloading mistune-0.8.4-py2.py3-none-any.whl (16 kB)
Collecting pandocfilters>=1.4.1
  Downloading pandocfilters-1.5.0-py2.py3-none-any.whl (8.7 kB)
Collecting jupyterlab-pygments
  Downloading jupyterlab_pygments-0.1.2-py2.py3-none-any.whl (4.6 kB)
Collecting testpath
  Downloading testpath-0.6.0-py3-none-any.whl (83 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 83.9/83.9 KB 295.2 MB/s eta 0:00:00
Collecting defusedxml
  Downloading defusedxml-0.7.1-py2.py3-none-any.whl (25 kB)
Collecting jupyter-core
  Downloading jupyter_core-4.9.2-py3-none-any.whl (86 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 86.9/86.9 KB 298.5 MB/s eta 0:00:00
Collecting bleach
  Downloading bleach-4.1.0-py2.py3-none-any.whl (157 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 157.9/157.9 KB 78.5 MB/s eta 0:00:00
Collecting jsonschema!=2.5.0,>=2.4
  Downloading jsonschema-4.4.0-py3-none-any.whl (72 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.7/72.7 KB 282.1 MB/s eta 0:00:00
Collecting pytz>=2017.3
  Downloading pytz-2021.3-py2.py3-none-any.whl (503 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 503.5/503.5 KB 46.5 MB/s eta 0:00:00
Collecting cfgv>=2.0.0
  Downloading cfgv-3.3.1-py2.py3-none-any.whl (7.3 kB)
Collecting identify>=1.0.0
  Downloading identify-2.4.11-py2.py3-none-any.whl (98 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 98.5/98.5 KB 214.4 MB/s eta 0:00:00
Collecting virtualenv>=20.0.8
  Downloading virtualenv-20.13.2-py2.py3-none-any.whl (8.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.7/8.7 MB 47.6 MB/s eta 0:00:00
Collecting nodeenv>=0.11.1
  Downloading nodeenv-1.6.0-py2.py3-none-any.whl (21 kB)
Collecting py>=1.5.0
  Downloading py-1.11.0-py2.py3-none-any.whl (98 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 98.7/98.7 KB 238.9 MB/s eta 0:00:00
Collecting packaging
  Downloading packaging-21.3-py3-none-any.whl (40 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 40.8/40.8 KB 245.5 MB/s eta 0:00:00
Collecting wcwidth
  Downloading wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting pluggy<1.0,>=0.12
  Downloading pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting more-itertools>=4.0.0
  Downloading more_itertools-8.12.0-py3-none-any.whl (54 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.3/54.3 KB 78.1 MB/s eta 0:00:00
Collecting pytest-cov~=2.8
  Downloading pytest_cov-2.12.0-py2.py3-none-any.whl (20 kB)
Collecting coverage[toml]>=5.2.1
  Downloading coverage-6.3.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (211 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 211.1/211.1 KB 137.0 MB/s eta 0:00:00
Collecting sphinxcontrib-applehelp
  Downloading sphinxcontrib_applehelp-1.0.2-py2.py3-none-any.whl (121 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.2/121.2 KB 311.6 MB/s eta 0:00:00
Collecting sphinxcontrib-qthelp
  Downloading sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl (90 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.6/90.6 KB 215.0 MB/s eta 0:00:00
Collecting babel>=1.3
  Downloading Babel-2.9.1-py2.py3-none-any.whl (8.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.8/8.8 MB 49.8 MB/s eta 0:00:00
Collecting sphinxcontrib-serializinghtml>=1.1.5
  Downloading sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl (94 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.0/94.0 KB 250.4 MB/s eta 0:00:00
Collecting imagesize
  Downloading imagesize-1.3.0-py2.py3-none-any.whl (5.2 kB)
Collecting sphinxcontrib-jsmath
  Downloading sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl (5.1 kB)
Collecting sphinxcontrib-devhelp
  Downloading sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl (84 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 84.7/84.7 KB 250.1 MB/s eta 0:00:00
Collecting requests>=2.5.0
  Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.1/63.1 KB 189.0 MB/s eta 0:00:00
Collecting snowballstemmer>=1.1
  Downloading snowballstemmer-2.2.0-py2.py3-none-any.whl (93 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 93.0/93.0 KB 215.1 MB/s eta 0:00:00
Collecting sphinxcontrib-htmlhelp>=2.0.0
  Downloading sphinxcontrib_htmlhelp-2.0.0-py2.py3-none-any.whl (100 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100.5/100.5 KB 298.7 MB/s eta 0:00:00
Collecting docutils<0.18,>=0.15
  Downloading docutils-0.16-py2.py3-none-any.whl (548 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 548.2/548.2 KB 54.8 MB/s eta 0:00:00
Collecting pydata-sphinx-theme~=0.7.2
  Downloading pydata_sphinx_theme-0.7.2-py3-none-any.whl (1.4 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 55.0 MB/s eta 0:00:00
Collecting beautifulsoup4<5,>=4.6.1
  Downloading beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.4/97.4 KB 262.3 MB/s eta 0:00:00
Requirement already satisfied: wheel in /home/ross/.virtualenvs/bar/lib/python3.10/site-packages (from sphinx-togglebutton~=0.3.0->myst-nb==0.13.2) (0.37.1)
Collecting toolz
  Downloading toolz-0.11.2-py3-none-any.whl (55 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.8/55.8 KB 276.9 MB/s eta 0:00:00
Collecting typing-extensions>=3.10.0
  Downloading typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Collecting zipp>=0.5
  Downloading zipp-3.7.0-py3-none-any.whl (5.3 kB)
Collecting six
  Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting tenacity>=6.2.0
  Downloading tenacity-8.0.1-py3-none-any.whl (24 kB)
Collecting pytest-datadir>=1.2.0
  Downloading pytest_datadir-1.3.1-py2.py3-none-any.whl (5.9 kB)
Collecting pybtex>=0.20
  Downloading pybtex-0.24.0-py2.py3-none-any.whl (561 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 561.4/561.4 KB 75.3 MB/s eta 0:00:00
Collecting pybtex-docutils>=1.0.0
  Downloading pybtex_docutils-1.0.1-py3-none-any.whl (4.8 kB)
Collecting mpmath>=0.19
  Downloading mpmath-1.2.1-py3-none-any.whl (532 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 532.6/532.6 KB 100.6 MB/s eta 0:00:00
Collecting soupsieve>1.2
  Downloading soupsieve-2.3.1-py3-none-any.whl (37 kB)
Collecting coverage[toml]>=5.2.1
  Downloading coverage-6.3.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (210 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 210.6/210.6 KB 329.9 MB/s eta 0:00:00
  Downloading coverage-6.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (210 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 210.4/210.4 KB 44.0 MB/s eta 0:00:00
  Downloading coverage-6.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (215 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 215.5/215.5 KB 28.5 MB/s eta 0:00:00
  Downloading coverage-6.1.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (215 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 215.1/215.1 KB 117.0 MB/s eta 0:00:00
  Downloading coverage-6.1.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (215 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 215.1/215.1 KB 93.5 MB/s eta 0:00:00
  Downloading coverage-6.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (214 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 214.9/214.9 KB 78.0 MB/s eta 0:00:00
  Downloading coverage-6.0.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (255 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 255.1/255.1 KB 150.3 MB/s eta 0:00:00
  Downloading coverage-6.0.1-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (254 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 254.7/254.7 KB 25.7 MB/s eta 0:00:00
  Downloading coverage-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (254 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 254.4/254.4 KB 146.6 MB/s eta 0:00:00
  Downloading coverage-5.5-cp310-cp310-manylinux1_x86_64.whl (238 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 239.0/239.0 KB 145.1 MB/s eta 0:00:00
  Downloading coverage-5.4.tar.gz (687 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 687.7/687.7 KB 53.5 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
  Downloading coverage-5.3.1.tar.gz (684 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 684.5/684.5 KB 59.4 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
  Downloading coverage-5.3.tar.gz (693 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 693.6/693.6 KB 47.8 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
  Downloading coverage-5.2.1.tar.gz (694 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 694.1/694.1 KB 71.0 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... done
INFO: pip is looking at multiple versions of cfgv to determine which version is compatible with other requirements. This could take a while.
Collecting cfgv>=2.0.0
  Downloading cfgv-3.3.0-py2.py3-none-any.whl (7.3 kB)
INFO: pip is looking at multiple versions of beautifulsoup4 to determine which version is compatible with other requirements. This could take a while.
Collecting beautifulsoup4<5,>=4.6.1
  Downloading beautifulsoup4-4.9.3-py3-none-any.whl (115 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 115.8/115.8 KB 305.5 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of babel to determine which version is compatible with other requirements. This could take a while.
Collecting babel>=1.3
  Downloading Babel-2.9.0-py2.py3-none-any.whl (8.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.8/8.8 MB 51.2 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of attrs to determine which version is compatible with other requirements. This could take a while.
Collecting attrs
  Downloading attrs-21.3.0-py2.py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.9/61.9 KB 248.8 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of sympy to determine which version is compatible with other requirements. This could take a while.
Collecting sympy
  Downloading sympy-1.8-py3-none-any.whl (6.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.1/6.1 MB 49.0 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of sphinxcontrib-bibtex to determine which version is compatible with other requirements. This could take a while.
Collecting sphinxcontrib-bibtex
  Downloading sphinxcontrib_bibtex-2.4.0-py3-none-any.whl (38 kB)
INFO: pip is looking at multiple versions of sphinx-copybutton to determine which version is compatible with other requirements. This could take a while.
Collecting sphinx-copybutton
  Downloading sphinx_copybutton-0.4.0-py3-none-any.whl (12 kB)
INFO: pip is looking at multiple versions of pytest-regressions to determine which version is compatible with other requirements. This could take a while.
Collecting pytest-regressions
  Downloading pytest_regressions-2.3.0-py3-none-any.whl (22 kB)
INFO: pip is looking at multiple versions of plotly to determine which version is compatible with other requirements. This could take a while.
Collecting plotly
  Downloading plotly-5.5.0-py2.py3-none-any.whl (26.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 26.5/26.5 MB 53.7 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of myst-nb to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of importlib-metadata to determine which version is compatible with other requirements. This could take a while.
Collecting importlib-metadata
  Downloading importlib_metadata-4.11.1-py3-none-any.whl (17 kB)
INFO: pip is looking at multiple versions of bokeh to determine which version is compatible with other requirements. This could take a while.
Collecting bokeh
  Downloading bokeh-2.4.1-py3-none-any.whl (18.5 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.5/18.5 MB 51.0 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of altair to determine which version is compatible with other requirements. This could take a while.
Collecting altair
  Downloading altair-4.1.0-py3-none-any.whl (727 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 727.8/727.8 KB 44.2 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of sphinx-togglebutton to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of sphinx-panels to determine which version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of docutils to determine which version is compatible with other requirements. This could take a while.
Collecting docutils<0.18,>=0.15
  Downloading docutils-0.15.2-py3-none-any.whl (547 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 547.6/547.6 KB 50.9 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of sphinx-book-theme to determine which version is compatible with other requirements. This could take a while.
Collecting sphinx-book-theme~=0.1.0
  Downloading sphinx_book_theme-0.1.9-py3-none-any.whl (94 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 94.5/94.5 KB 293.7 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of alabaster to determine which version is compatible with other requirements. This could take a while.
Collecting alabaster
  Downloading alabaster-0.7.11-py2.py3-none-any.whl (14 kB)
INFO: pip is looking at multiple versions of sphinx to determine which version is compatible with other requirements. This could take a while.
Collecting sphinx<5,>=3.1
  Downloading Sphinx-4.3.2-py3-none-any.whl (3.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 51.2 MB/s eta 0:00:00
INFO: pip is looking at multiple versions of pyyaml to determine which version is compatible with other requirements. This could take a while.
Collecting pyyaml
  Downloading PyYAML-5.4.1.tar.gz (175 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 175.1/175.1 KB 76.5 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
INFO: pip is looking at multiple versions of pytest-cov to determine which version is compatible with other requirements. This could take a while.
Collecting pytest-cov~=2.8
  Downloading pytest_cov-2.11.1-py2.py3-none-any.whl (20 kB)
  Downloading pytest_cov-2.11.0-py2.py3-none-any.whl (20 kB)
  Downloading pytest_cov-2.10.1-py2.py3-none-any.whl (19 kB)
Collecting parso<0.9.0,>=0.8.0
  Downloading parso-0.8.3-py2.py3-none-any.whl (100 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100.8/100.8 KB 60.0 MB/s eta 0:00:00
Collecting MarkupSafe>=2.0
  Downloading MarkupSafe-2.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0
  Downloading pyrsistent-0.18.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (115 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 115.8/115.8 KB 303.6 MB/s eta 0:00:00
Collecting nest-asyncio
  Downloading nest_asyncio-1.5.4-py3-none-any.whl (5.1 kB)
Collecting pyzmq>=13
  Downloading pyzmq-22.3.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 48.9 MB/s eta 0:00:00
Collecting ptyprocess>=0.5
  Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
Collecting latexcodec>=1.0.4
  Downloading latexcodec-2.0.1-py2.py3-none-any.whl (18 kB)
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 138.7/138.7 KB 212.4 MB/s eta 0:00:00
Collecting certifi>=2017.4.17
  Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 149.2/149.2 KB 207.5 MB/s eta 0:00:00
Collecting charset-normalizer~=2.0.0
  Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Downloading idna-3.3-py3-none-any.whl (61 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.2/61.2 KB 243.2 MB/s eta 0:00:00
Collecting greenlet!=0.4.17
  Downloading greenlet-1.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (155 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.4/155.4 KB 275.7 MB/s eta 0:00:00
Collecting distlib<1,>=0.3.1
  Downloading distlib-0.3.4-py2.py3-none-any.whl (461 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 461.2/461.2 KB 45.8 MB/s eta 0:00:00
Collecting filelock<4,>=3.2
  Downloading filelock-3.6.0-py3-none-any.whl (10.0 kB)
Collecting platformdirs<3,>=2
  Downloading platformdirs-2.5.1-py3-none-any.whl (14 kB)
Collecting notebook>=4.4.1
  Downloading notebook-6.4.8-py3-none-any.whl (9.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.9/9.9 MB 48.6 MB/s eta 0:00:00
Collecting webencodings
  Downloading webencodings-0.5.1-py2.py3-none-any.whl (11 kB)
Collecting jupyter-server-mathjax>=0.2.2
  Downloading jupyter_server_mathjax-0.2.5-py3-none-any.whl (3.1 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.1/3.1 MB 52.3 MB/s eta 0:00:00
Collecting jupyter-server
  Downloading jupyter_server-1.13.5-py3-none-any.whl (397 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 397.5/397.5 KB 43.7 MB/s eta 0:00:00
Collecting colorama
  Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Collecting GitPython!=2.1.4,!=2.1.5,!=2.1.6
  Downloading GitPython-3.1.27-py3-none-any.whl (181 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.2/181.2 KB 92.2 MB/s eta 0:00:00
Collecting gitdb<5,>=4.0.1
  Downloading gitdb-4.0.9-py3-none-any.whl (63 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.1/63.1 KB 272.4 MB/s eta 0:00:00
Collecting argon2-cffi
  Downloading argon2_cffi-21.3.0-py3-none-any.whl (14 kB)
Collecting terminado>=0.8.3
  Downloading terminado-0.13.2-py3-none-any.whl (14 kB)
Collecting websocket-client
  Downloading websocket_client-1.3.1-py3-none-any.whl (54 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.2/54.2 KB 258.7 MB/s eta 0:00:00
Collecting Send2Trash
  Downloading Send2Trash-1.8.0-py3-none-any.whl (18 kB)
Collecting anyio<4,>=3.1.0
  Downloading anyio-3.5.0-py3-none-any.whl (79 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.2/79.2 KB 297.6 MB/s eta 0:00:00
Collecting prometheus-client
  Downloading prometheus_client-0.13.1-py3-none-any.whl (57 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.1/57.1 KB 249.3 MB/s eta 0:00:00
Collecting sniffio>=1.1
  Downloading sniffio-1.2.0-py3-none-any.whl (10 kB)
Collecting smmap<6,>=3.0.1
  Downloading smmap-5.0.0-py3-none-any.whl (24 kB)
Collecting argon2-cffi-bindings
  Downloading argon2_cffi_bindings-21.2.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (86 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 86.2/86.2 KB 89.6 MB/s eta 0:00:00
Collecting cffi>=1.0.1
  Downloading cffi-1.15.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (446 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 446.3/446.3 KB 55.4 MB/s eta 0:00:00
Collecting pycparser
  Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.7/118.7 KB 165.7 MB/s eta 0:00:00
Building wheels for collected packages: coverage, matplotlib, cPyparsing, tornado
  Building wheel for coverage (setup.py) ... done
  Created wheel for coverage: filename=coverage-4.5.4-cp310-cp310-linux_x86_64.whl size=206348 sha256=b57945a284d41cbfd6796bf86409d9da683915f5207f877be1471a0529135de0
  Stored in directory: /tmp/pip-ephem-wheel-cache-vwfuv5uc/wheels/24/3b/59/8b5b481efbac60694af4d7d281e8b27461384fcbd51ae53c8e
  Building wheel for matplotlib (setup.py) ... done
  Created wheel for matplotlib: filename=matplotlib-3.3.4-cp310-cp310-linux_x86_64.whl size=11960987 sha256=0b63c09e7afe329ed5963dd1bfb9fb7f914d3ba3f1368a8929e388285cee8443
  Stored in directory: /tmp/pip-ephem-wheel-cache-vwfuv5uc/wheels/38/c6/49/eaba6d234887d98d9c85185e2a90bd7bb77934e85eefaf317e
  Building wheel for cPyparsing (setup.py) ... done
  Created wheel for cPyparsing: filename=cPyparsing-2.4.5.0.1.1-cp310-cp310-linux_x86_64.whl size=4822321 sha256=b08c4b0b70545eaf09266a6b6ac4e10041682037a384f8faee6665c847828208
  Stored in directory: /tmp/pip-ephem-wheel-cache-vwfuv5uc/wheels/d4/73/da/888c849d3575989951a6070a90ceab6d6bb6417b3d2b17652f
  Building wheel for tornado (setup.py) ... done
  Created wheel for tornado: filename=tornado-6.1-cp310-cp310-linux_x86_64.whl size=421973 sha256=06e3ca0309f76cbde85810bfc19018bbdca6f1ce265e1a458e74437b50b600a8
  Stored in directory: /tmp/pip-ephem-wheel-cache-vwfuv5uc/wheels/80/32/8d/21cf0fa6ee4e083f6530e5b83dfdfa9489a3890d320803f4c7
Successfully built coverage matplotlib cPyparsing tornado
Installing collected packages: webencodings, wcwidth, snowballstemmer, Send2Trash, pytz, ptyprocess, pickleshare, nodeenv, mpmath, mistune, ipython-genutils, distlib, cPyparsing, certifi, backcall, alabaster, zipp, websocket-client, urllib3, typing-extensions, traitlets, tornado, toolz, toml, testpath, tenacity, sympy, sphinxcontrib-serializinghtml, sphinxcontrib-qthelp, sphinxcontrib-jsmath, sphinxcontrib-htmlhelp, sphinxcontrib-devhelp, sphinxcontrib-applehelp, soupsieve, sniffio, smmap, six, pyzmq, pyyaml, pyrsistent, pyparsing, pygments, pycparser, py, prompt-toolkit, prometheus-client, pluggy, platformdirs, pillow, pexpect, parso, pandocfilters, numpy, nest-asyncio, more-itertools, MarkupSafe, kiwisolver, jupyterlab-widgets, imagesize, idna, identify, greenlet, filelock, entrypoints, docutils, defusedxml, decorator, cycler, coverage, colorama, charset-normalizer, cfgv, babel, attrs, virtualenv, terminado, sqlalchemy, requests, python-dateutil, plotly, packaging, matplotlib-inline, markdown-it-py, latexcodec, jupyterlab-pygments, jupyter-core, jsonschema, jinja2, jedi, importlib-metadata, gitdb, coconut, cffi, beautifulsoup4, anyio, sphinx, pytest, pybtex, pre-commit, pandas, nbformat, mdit-py-plugins, matplotlib, jupyter-client, ipython, GitPython, bokeh, bleach, argon2-cffi-bindings, sphinx-togglebutton, sphinx-panels, sphinx-copybutton, pytest-datadir, pytest-cov, pydata-sphinx-theme, pybtex-docutils, nbclient, myst-parser, jupytext, ipykernel, argon2-cffi, altair, sphinxcontrib-bibtex, sphinx-book-theme, pytest-regressions, nbconvert, notebook, jupyter-server, widgetsnbextension, jupyter-server-mathjax, nbdime, ipywidgets, jupyter-sphinx, jupyter-cache, myst-nb
  Running setup.py develop for myst-nb
Successfully installed GitPython-3.1.27 MarkupSafe-2.1.0 Send2Trash-1.8.0 alabaster-0.7.12 altair-4.2.0 anyio-3.5.0 argon2-cffi-21.3.0 argon2-cffi-bindings-21.2.0 attrs-21.4.0 babel-2.9.1 backcall-0.2.0 beautifulsoup4-4.10.0 bleach-4.1.0 bokeh-2.4.2 cPyparsing-2.4.5.0.1.1 certifi-2021.10.8 cffi-1.15.0 cfgv-3.3.1 charset-normalizer-2.0.12 coconut-1.4.3 colorama-0.4.4 coverage-4.5.4 cycler-0.11.0 decorator-5.1.1 defusedxml-0.7.1 distlib-0.3.4 docutils-0.16 entrypoints-0.4 filelock-3.6.0 gitdb-4.0.9 greenlet-1.1.2 identify-2.4.11 idna-3.3 imagesize-1.3.0 importlib-metadata-4.11.2 ipykernel-5.5.6 ipython-7.32.0 ipython-genutils-0.2.0 ipywidgets-7.6.5 jedi-0.18.1 jinja2-3.0.3 jsonschema-4.4.0 jupyter-cache-0.4.3 jupyter-client-7.1.2 jupyter-core-4.9.2 jupyter-server-1.13.5 jupyter-server-mathjax-0.2.5 jupyter-sphinx-0.3.2 jupyterlab-pygments-0.1.2 jupyterlab-widgets-1.0.2 jupytext-1.11.5 kiwisolver-1.3.2 latexcodec-2.0.1 markdown-it-py-1.1.0 matplotlib-3.3.4 matplotlib-inline-0.1.3 mdit-py-plugins-0.2.8 mistune-0.8.4 more-itertools-8.12.0 mpmath-1.2.1 myst-nb-0.13.2 myst-parser-0.15.2 nbclient-0.5.11 nbconvert-6.4.2 nbdime-3.1.1 nbformat-5.1.3 nest-asyncio-1.5.4 nodeenv-1.6.0 notebook-6.4.8 numpy-1.22.2 packaging-21.3 pandas-1.3.5 pandocfilters-1.5.0 parso-0.8.3 pexpect-4.8.0 pickleshare-0.7.5 pillow-9.0.1 platformdirs-2.5.1 plotly-5.6.0 pluggy-0.13.1 pre-commit-2.17.0 prometheus-client-0.13.1 prompt-toolkit-3.0.28 ptyprocess-0.7.0 py-1.11.0 pybtex-0.24.0 pybtex-docutils-1.0.1 pycparser-2.21 pydata-sphinx-theme-0.7.2 pygments-2.11.2 pyparsing-3.0.7 pyrsistent-0.18.1 pytest-5.4.3 pytest-cov-2.10.1 pytest-datadir-1.3.1 pytest-regressions-2.3.1 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 pyzmq-22.3.0 requests-2.27.1 six-1.16.0 smmap-5.0.0 sniffio-1.2.0 snowballstemmer-2.2.0 soupsieve-2.3.1 sphinx-4.4.0 sphinx-book-theme-0.1.10 sphinx-copybutton-0.5.0 sphinx-panels-0.4.1 sphinx-togglebutton-0.3.0 sphinxcontrib-applehelp-1.0.2 sphinxcontrib-bibtex-2.4.1 
sphinxcontrib-devhelp-1.0.2 sphinxcontrib-htmlhelp-2.0.0 sphinxcontrib-jsmath-1.0.1 sphinxcontrib-qthelp-1.0.3 sphinxcontrib-serializinghtml-1.1.5 sqlalchemy-1.4.31 sympy-1.9 tenacity-8.0.1 terminado-0.13.2 testpath-0.6.0 toml-0.10.2 toolz-0.11.2 tornado-6.1 traitlets-5.1.1 typing-extensions-4.1.1 urllib3-1.26.8 virtualenv-20.13.2 wcwidth-0.2.5 webencodings-0.5.1 websocket-client-1.3.1 widgetsnbextension-3.5.2 zipp-3.7.0

AFAICT the dependency resolver is working exactly as intended, but as a consequence of the strict pinning, I end up getting versions of some libraries (e.g. matplotlib==3.3.4) that don't have wheels provided for Python 3.10, so they have to be built from source, which takes several minutes. Even after everything installs successfully, the resulting environment doesn't work - dev tasks like running unit tests and building docs all fail:

Output of `$ pytest`
Traceback (most recent call last):
  File "/home/ross/repos/MyST-NB/mystnb-dev/bin/pytest", line 8, in <module>
    sys.exit(main())
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/config/__init__.py", line 105, in main
    config = _prepareconfig(args, plugins)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/config/__init__.py", line 257, in _prepareconfig
    return pluginmanager.hook.pytest_cmdline_parse(
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/hooks.py", line 286, in __call__
    return self._hookexec(self, self.get_hookimpls(), kwargs)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/manager.py", line 93, in _hookexec
    return self._inner_hookexec(hook, methods, kwargs)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/manager.py", line 84, in <lambda>
    self._inner_hookexec = lambda hook, methods, kwargs: hook.multicall(
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/callers.py", line 203, in _multicall
    gen.send(outcome)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/helpconfig.py", line 90, in pytest_cmdline_parse
    config = outcome.get_result()
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/callers.py", line 80, in get_result
    raise ex[1].with_traceback(ex[2])
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/callers.py", line 187, in _multicall
    res = hook_impl.function(*args)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/config/__init__.py", line 836, in pytest_cmdline_parse
    self.parse(args)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/config/__init__.py", line 1044, in parse
    self._preparse(args, addopts=addopts)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/config/__init__.py", line 992, in _preparse
    self.pluginmanager.load_setuptools_entrypoints("pytest11")
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pluggy/manager.py", line 299, in load_setuptools_entrypoints
    plugin = ep.load()
  File "/usr/lib/python3.10/importlib/metadata/__init__.py", line 162, in load
    module = import_module(match.group('module'))
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 992, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/assertion/rewrite.py", line 143, in exec_module
    source_stat, co = _rewrite_test(fn, self.config)
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/_pytest/assertion/rewrite.py", line 330, in _rewrite_test
    co = compile(tree, fn, "exec", dont_inherit=True)
TypeError: required field "lineno" missing from alias
Output of `$ cd docs && make html`
$ cd docs
$ make html
Running Sphinx v4.4.0
making output directory... done
Traceback (most recent call last):
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/_pyparsing.py", line 45, in <module>
    import cPyparsing as _pyparsing
ImportError: /home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/cPyparsing.cpython-310-x86_64-linux-gnu.so: undefined symbol: _PyGen_Send

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/_pyparsing.py", line 60, in <module>
    from pyparsing import ( # NOQA
ImportError: cannot import name '_trim_arity' from 'pyparsing' (/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/pyparsing/__init__.py)
Traceback (most recent call last):
  File "/home/ross/repos/MyST-NB/mystnb-dev/bin/coconut", line 5, in <module>
    from coconut.main import main
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/main.py", line 35, in <module>
    from coconut.command import Command
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/command/__init__.py", line 20, in <module>
    from coconut.command.command import * # NOQA
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/command/command.py", line 30, in <module>
    from coconut._pyparsing import PYPARSING_INFO
  File "/home/ross/repos/MyST-NB/mystnb-dev/lib/python3.10/site-packages/coconut/_pyparsing.py", line 84, in <module>
    raise ImportError(
ImportError: Coconut requires pyparsing/cPyparsing version >= 2.4.5 (run 'pip install --upgrade cPyparsing' to fix)

Exception occurred:
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['coconut', '--jupyter']' returned non-zero exit status 1.
The full traceback has been saved in /tmp/sphinx-err-f4wz7v4r.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at https://github.com/sphinx-doc/sphinx/issues. Thanks!
make: *** [Makefile:20: html] Error 2

Note that despite the project pinning, the doc build fails with a dependency error at import time. The specifics of the above are better suited for a myst-nb issue, but I bring it up here to illustrate a larger point. If instead I go to the setup.cfg and install all of the dependencies listed there without the pins, I get closer to a working environment - i.e. the docs build works out of the box and the test suite runs (though with errors that pinned dependencies on ipython and pandas would fix). I don't have any expertise in dependency management or packaging best practices, but to my untrained eye it seems that the aggressive dependency pinning is actually more of a detriment in this case.

I realize the "setting-up-a-dev-environment" case is not relevant for the primary use-case discussed here, i.e. users installing EBP tools for publishing their own work. However, users are not likely to be using EBP in a vacuum - they will be installing it into environments that have existing dependency constraints, where some of these observations are likely to be relevant.

Anyway, I just wanted to share a bit more of my experience working with EBP tools, as I've definitely experienced more dependency issues under the EBP umbrella than e.g. other scientific Python libraries. It's definitely a complex issue and it could be that there are major flaws in my own workflow.
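To make the failure mode concrete: if one package caps a shared dependency below a major version while another package requires at least that major version, no release satisfies both constraints and the whole environment becomes unresolvable. A minimal stdlib-only sketch (the bounds below are hypothetical, loosely modeled on the click example):

```python
# Hypothetical bounds on a shared dependency:
#   package A pins  dep >=7.1, <8.0   (upper cap)
#   package B needs dep >=8.0         (new floor)
# Versions are modeled as (major, minor) tuples for simplicity.

def satisfies(version, lower, upper):
    """True if lower <= version < upper."""
    return lower <= version < upper

candidates = [(7, 1), (7, 2), (8, 0), (8, 1)]
ok_for_a = {v for v in candidates if satisfies(v, (7, 1), (8, 0))}
ok_for_b = {v for v in candidates if satisfies(v, (8, 0), (99, 0))}

# The resolver must pick a version in the intersection — there is none.
print(sorted(ok_for_a & ok_for_b))  # → []
```

Nothing a user does in their own environment can fix this; only a new release of one of the two packages can.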

@chrisjsewell

chrisjsewell commented Mar 3, 2022

I end up getting versions of some libraries (e.g. matplotlib==3.3.4) that don't have wheels provided for Python 3.10
though with errors that would be fixed with pinned dependencies in ipython and pandas

Heya @rossbar, myst-nb does not currently support python 3.10 - not in the setup.cfg classifiers, nor in the CI testing.
It is tested against python 3.6 to 3.9, on linux, mac osx and windows: https://github.com/executablebooks/MyST-NB/blob/52f7fb7f4b36ad5ac12a6625a78403b8b2dfd36b/.github/workflows/tests.yml#L25-L37

should the python version be pinned to <3.10? not according to this discussion 😬

matplotlib is also specifically pinned because the tests were failing with 3.4.0: https://github.com/executablebooks/MyST-NB/blob/52f7fb7f4b36ad5ac12a6625a78403b8b2dfd36b/setup.cfg#L94-L95

I realize the "setting-up-a-dev-environment" case is not relevant for the primary use-case discussed here

So yes, I would say setting up a dev environment is a very different discussion: basically we only guarantee it works for the environments run on GH Actions: https://github.com/executablebooks/MyST-NB/actions/runs/1826013747

@chrisjsewell

jupyter-book had the dependency jupytext~=1.11.3 but I wanted to install jupytext==1.12.0-dev to test against.

FYI, jupytext is no longer a dependency of jupyter-book (or any of the EBP stack): executablebooks/jupyter-book#1645 (comment)

@bryanwweber

jupyter-book had the dependency jupytext~=1.11.3 but I wanted to install jupytext==1.12.0-dev to test against.

FYI, jupytext is no longer a dependency of jupyter-book (or any of the EBP stack): executablebooks/jupyter-book#1645 (comment)

That's fine, I think the general point still stands...

should the python version be pinned to <3.10? not according to this discussion 😬

No, it should not be, unless there is a specific known issue with Python 3.10, because in all likelihood it will "just work". The problem here is that other package versions are pinned. If those versions were unpinned, this wouldn't be a problem, and you wouldn't have to add the Python 3.10 pin either.

As far as I can tell, everyone who's commented on this issue is against upper version pinning (with the possible exception of @choldgraf who didn't express a strong opinion one way or the other). I also recognize that most of the people who've commented are essentially power users, so may not represent the "average" user; however, I would submit that at this point in the EBP hype-cycle, the average user is probably a power user.

At this point, I've said my piece... I'll be unsubscribing from this issue. If you want any help with implementing or replying to issues, please open a new issue and tag me (or hit me on Twitter, @darthbith) 😄

@choldgraf

choldgraf commented Mar 3, 2022

I'll just note that I'm trying not to express strong opinions because I want to listen to what others are saying here, and don't want to lead the discussion too much. I do agree that right now the overwhelming signal we seem to be getting is that version pinning is causing problems for people (with the caveat that the users here are the "power users" of this stack, though I agree that this is a particularly important class of users to keep happy).

@chrisjsewell has a point that we aren't seeing complaints about upstream updates breaking things precisely because we are pinning upper versions. But I also don't think that this should trump any and all feedback that people are giving about version pinning being problematic.

Compromise?

Can we try to strike a compromise here between "pin everything" and "pin nothing"? E.g., going back to the original proposal, what if we added another step which was adding "canary" CI/CD jobs for our dependencies. So:

  1. Pin docutils to a minor version, and sphinx to a major version cap, because these tend to break things with releases.
  2. Run a CI/CD job on `latest` for each of our dependencies, so that we can get ahead of problems before they arise (we could do this with Python 3.10 as well). If we don't have time to apply a fix before a new release lands, we could apply a temporary version cap until it is fixed.
  3. If we know a dependency will have a breaking change with a new release, or we lose confidence that it is stable in general, then start capping its version until we're confident that it will no longer break things.

To me, this feels like it would:

  1. Reduce our risk associated with unexpected downstream changes, because we'd catch them ahead of time in the `latest` CI/CD builds.
  2. Reduce the pain we inflict on users because of aggressive version pinning, because it would only pin a subset of our dependencies - specifically, the ones that we know we want to pin.

To me the main downside with it is that it would add an extra layer of complexity to our CI/CD jobs, but looking across the Python ecosystem it seems that these canary deployments are pretty common practice, so I can't imagine it's causing too much extra pain for those communities (somebody correct me if I'm wrong).
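To illustrate what such a canary job could look like, here is a sketch of a GitHub Actions workflow. The file name, the `[testing]` extra, and the schedule are assumptions for illustration, not an existing EBP workflow:

```yaml
# .github/workflows/canary.yml (hypothetical) — install against the newest
# allowed versions of everything, so upstream breakage surfaces before a
# user's fresh environment hits it.
name: canary

on:
  schedule:
    - cron: "0 6 * * 1"  # weekly, decoupled from our own PRs
  workflow_dispatch:

jobs:
  latest-deps:
    runs-on: ubuntu-latest
    continue-on-error: true  # a canary failure warns us, it doesn't block CI
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install, eagerly upgrading every dependency
        run: |
          pip install --upgrade pip
          pip install --upgrade --upgrade-strategy eager .[testing]
      - name: Run the test suite against latest dependencies
        run: pytest
```

Note that `--upgrade-strategy eager` only upgrades within whatever caps remain in the metadata, which is exactly the point: a scheduled job like this tells us a new upstream release broke us days before a bug report would.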

@chrisjsewell

Can we try to strike a compromise here between "pin everything" and "pin nothing"?

I would note we don't actually pin everything already.

I think it would be helpful to have some specifics here.
Below are the dependency requirements from most of the core EBP packages.
What are people actually proposing to change?

jupyter-book (develop)

dependencies = [
    "click>=7.1,<9",
    "docutils>=0.15,<0.18",
    "jsonschema<4",
    "linkify-it-py~=1.0.1",
    "myst-nb~=0.13.1",
    "pyyaml",
    "sphinx>=3,<5",
    "sphinx-comments",
    "sphinx-copybutton",
    "sphinx-external-toc~=0.2.3",
    "sphinx-jupyterbook-latex~=0.4.6",
    "sphinx-panels~=0.6.0",
    "sphinx-thebe~=0.1.1",
    "sphinx_book_theme~=0.1.4",
    "sphinx_togglebutton",
    "sphinxcontrib-bibtex>=2.2.0,<=2.5.0",
    "sphinx-multitoc-numbering~=0.1.3",
]

myst-nb (after executablebooks/MyST-NB#380)

install_requires =
    docutils>=0.15,<0.18
    importlib_metadata
    ipython
    jupyter-cache~=0.5.0
    myst-parser~=0.17.0
    nbformat~=5.0
    pyyaml
    sphinx>=3.5,<5
    sphinx-togglebutton~=0.3.0
    typing-extensions

jupyter-cache (v0.5.0)

install_requires =
    attrs
    click
    importlib-metadata
    nbclient>=0.2,<0.6
    nbformat
    pyyaml
    sqlalchemy>=1.3.12,<1.5
    tabulate

myst-parser (v0.17.0)

install_requires =
    docutils>=0.15,<0.18
    jinja2
    markdown-it-py>=1.0.0,<3.0.0
    mdit-py-plugins~=0.3.0
    pyyaml
    sphinx>=3.1,<5
    typing-extensions

@choldgraf

choldgraf commented Mar 3, 2022

Good point - I guess the answer for the above proposal would be:

Unpin everything except for:

  • Docutils
  • Sphinx
  • Anything else that we know will break with the next significant release (major or minor depending on the tool)
  • (maybe any EBP projects that are central to the functionality of a tool?)

And in all projects (or at least, the "core stack" ones), start running a CI/CD job that tested all of these things against "latest".

So:

jupyter-book

dependencies = [
    "click>=7.1",
    "docutils>=0.15,<0.18",  # We might be able to stop pinning docutils explicitly, since I believe Sphinx already pins it as of one of the 3.x releases
    "jsonschema",
    "linkify-it-py",
    "myst-nb~=0.13.1", 
    "pyyaml",
    "sphinx>=3,<5",
    "sphinx-comments",
    "sphinx-copybutton",
    "sphinx-external-toc",
    "sphinx-jupyterbook-latex",
    "sphinx-panels",
    "sphinx-thebe",
    "sphinx_book_theme~=0.1.4",
    "sphinx_togglebutton",  # We can probably get rid of this, since it's a dep of myst-nb
    "sphinxcontrib-bibtex>=2.2.0",
    "sphinx-multitoc-numbering",
]

or if we wanted to also not pin our own core tools (or rather, only pin them if we know that the next major release will break things):

dependencies = [
    "click>=7.1",
    "docutils>=0.15,<0.18",
    "jsonschema",
    "linkify-it-py",
    "myst-nb", 
    "pyyaml",
    "sphinx>=3,<5",
    "sphinx-comments",
    "sphinx-copybutton",
    "sphinx-external-toc",
    "sphinx-jupyterbook-latex",
    "sphinx-panels",
    "sphinx-thebe",
    "sphinx_book_theme",
    "sphinx_togglebutton",  # We can probably get rid of this, since it's a dep of myst-nb
    "sphinxcontrib-bibtex>=2.2.0",
    "sphinx-multitoc-numbering",
]

@choldgraf

choldgraf commented Mar 3, 2022

Also just a note that I think we have done a nice job of reducing our dependency exposure in general, so things like removing a dependency entirely on nbconvert will have a big difference. Looking at the list above, I also suspect only one or two dependencies would really be problematic from a pinning standpoint (e.g. nbformat, click, or other dependencies that many other projects would likely touch as well)

@chrisjsewell

this appears to be going against the original proposal?

We do cap:

  • Our own EBP tools (because we know that we follow semver and can coordinate releases)

@choldgraf

choldgraf commented Mar 3, 2022

Sorry - I was restricting the EBP capping to just the "core" projects that we know would break things if they introduced a breaking change. I don't feel super strongly about this though.

An example of this was the recent sphinx-togglebutton release. It overhauled the look and feel of the toggle button in a big way, and that's why I bumped the minor version, but it wasn't strictly a breaking change, so you could make a case for it being a patch release instead. Because we have it pinned, it will cause some unnecessary toil for us in explicitly bumping the version. I'd lean towards not pinning that version, unless we know that the next release will break things. But again I don't feel strongly on that one.

Compare that with myst-parser, which recently changed the default behavior of Dollarmath, and I think that's one where we do want to pin the version because otherwise it would have suddenly changed many people's books.
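As a reference point for what the `~=` pins in this thread allow: per PEP 440, `myst-parser~=0.17.0` is shorthand for `>=0.17.0, <0.18.0`, so patch releases flow through but a behavior-changing 0.18.0 would not. A small illustrative sketch (the helper below only handles three-component pins and is not a real packaging API):

```python
# Illustrative expansion of a compatible-release pin like "myst-parser~=0.17.0".
# Per PEP 440, ~=X.Y.Z means >=X.Y.Z, <X.(Y+1).0 (only 3-part versions handled).

def compatible_release(pin):
    """Return the (inclusive floor, exclusive ceiling) a '~=' pin expands to."""
    version = pin.split("~=")[1]
    floor = tuple(int(part) for part in version.split("."))
    ceiling = floor[:-2] + (floor[-2] + 1, 0)
    return floor, ceiling

print(compatible_release("myst-parser~=0.17.0"))  # → ((0, 17, 0), (0, 18, 0))
print(compatible_release("jupytext~=1.11.3"))     # → ((1, 11, 3), (1, 12, 0))
```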
