Goal: ensure lock file consistency after all installation commands #1255
Noting a potential implementation challenge here: I'm not sure that However, install-time conflict detection should be possible regardless, since pip is used to handle the actual installs.
You don't need to wait until install time per se. You can do this when you resolve the full transitive dependencies for the lock file. You can mostly do this with See also pypa/pipfile#100. It is still a root issue that
@taion We're not going to switch If we can detect any conflicts at lock time instead of at install time, that will be excellent, though.
In practice from a user perspective that's just a special case of #966 and other related (closed) issues. Like worst case you just cache the resolved transitive dependencies in the lockfile. Yarn and npm already do that anyway. It'd let you update all your prod deps without hitting PyPI for the dev deps. Unless I'm missing something there's no active benefit of letting things drift out of sync. As someone who's spent a lot of time with these package management patterns, most of these deviations from how existing tools work just seem to cause pain.
I'll also add that npm does in fact support updating only production dependencies and not development ones, and it does in fact use a flat lockfile (with cached resolutions). Interestingly enough, Yarn doesn't for its bulk "update dependencies" subcommand (though the interactive version does split prod and dev deps), and it doesn't look like anybody's complained about it, but that can probably be chalked up to the npm-alikes using caret ranges for dependencies by default, plus people generally being good enough about following SemVer to seldom cause users problems from bumping dev deps. The Python ecosystem, not so much.
@taion An explicit If there's a symmetric solution that also allows both For the symmetric approach, I think one way of framing & implementing that would be to pursue the approach that @vphilippon described in #966 (comment), but structure it as follows:
(Note: for security management reasons, I really want to keep the default behaviour of |
The behavior you're describing for In other words you can't fully split out the "install" from the "lock". See e.g. https://github.com/taion/pipf/blob/1feee35a2e4480bc7e5b53bfab17587d37bdf9dd/pipf/pipfile.py#L150-L155 – when regenerating the lockfile, I explicitly pass in the packages that are getting installed to discard the old constraints for those. The complaint anyway isn't that something like an explicit Yarn's handling here is a pretty good example in terms of CLI. Yarn has I've never had to explicitly generate a lockfile outside of installing my updated dependencies, and I don't see why I would want to do so. It's not a typical action with a locking package manager, since almost any action you take to modify your installed dependencies will update the lockfile anyway. The set of user-facing interactions instead looks more like:
(1) and (2) should always apply minimal changes to my lockfile (or generate it if necessary), while (3) should essentially disregard the existing lockfile. But for just generating a lockfile, (1) is sufficient (since it's going to be run from a dev environment where everything is installed anyway). P.S. Yarn splits out
The first few paragraphs in https://lwn.net/Articles/711906/ do a pretty good job of describing my mindset here: "moving target" should be the default security model for all projects, and opting out of that and into the "hardened bunker" approach should be a deliberate design decision taken only after considering the trade-offs. Hence the choice of option name as well: For the specific case of
Thus the idea of ensuring consistency by always using the lock file to drive the installation (unless
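That "lock file drives the installation" model can be sketched in a few lines. This is purely illustrative: all names (`lock`, `sync`, `install`, the dict-based data model, the toy resolver) are hypothetical stand-ins, not pipenv's actual internals.

```python
# Hypothetical model of "edit Pipfile -> relock -> sync": the environment
# can only ever be built from the lock file, so the two cannot drift apart.
def lock(pipfile, resolve):
    """Pin every Pipfile entry via the resolver."""
    return {name: resolve(name, spec) for name, spec in pipfile.items()}

def sync(lockfile):
    """Build the environment strictly from the lock file."""
    return dict(lockfile)

def install(pipfile, package, spec, resolve):
    pipfile[package] = spec            # 1. record intent in Pipfile
    lockfile = lock(pipfile, resolve)  # 2. regenerate the lock file
    env = sync(lockfile)               # 3. lock file drives the install
    return lockfile, env

# Toy resolver that always pins version 1.0 (an invented stand-in).
resolver = lambda name, spec: f"{name}==1.0"
lockfile, env = install({"requests": "*"}, "flask", "*", resolver)
assert env == lockfile == {"requests": "requests==1.0", "flask": "flask==1.0"}
```

The point of the ordering is that step 3 never consults the network or the `Pipfile` directly, only the freshly written lock file.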
I think we're in agreement on the substantive points on the desirability of upgrading dependencies. And, sure, I am saying, though, in a normal workflow, users have no reason to run
All the above actions keep Or, to put it another way, I've never had to run (And the reason it's not so useful to just generate a lockfile with updated dependencies is because the very next step in a dev workflow is to reinstall the updated dependencies, then run the test suite.)
FWIW, I run a tool which creates dependency update PRs automatically (Dependabot), including for Pipfiles, and I would love a In my experience, PRs to update a single dependency are much more likely to be merged than ones that update many at once. Generating them automatically helps ensure the moving target keeps moving.
Right, and PR generation is where I think
In that vein, if While
The equivalent for the Greenkeeper/&c. use case with npm is just the following, right?
In fact does npm5 even expose a way to just create a Greenkeeper-like services are very cool and worth using, but it'd be odd to consider that a core use case (and as noted above, even that case of "update a single dependency and the lockfile, but don't install things" admits a cleaner expression).
Oops, I hit enter too soon. Outside of Greenkeeper-style workflows, it's almost vanishingly uncommon to just bump my dependencies without syncing them locally. It only makes sense if you somehow don't have a local copy of the deps installed. And again PR-only services are an exception, but only that. The description here is "Python Development Workflow for Humans", and as such it's worth keeping in mind what actual development workflows look like. And we have a pretty good existence proof that explicitly running the "generate lockfile" operation is almost never necessary, so it'd be worthwhile to not think too much in terms of those. The specific issue here, BTW, is that Pipenv uses An upgrade operation could in principle switch a
@taion - yep, agreed, not expecting you guys to keep us / other dependency update tools in mind too much, although very grateful if you do! I only mentioned it because I know I would have used For reference,
@greysteil What do you do if someone has a
I don't understand the working of Pipenv in anywhere near the detail that others on this thread do, and don't want to take it off-topic. If the below is useful, great. If it's not, sorry! We use some nasty hax:
End result is that the user gets a PR with their As I understand it,
So, that would be a bunch better. Ideal would be to be able to run
Suppose my Pipfile has:
If I weren't using Pipenv, I'd just run:
If I were using pip-tools, I'd run:
Well, just doing some hypothetical But unless I'm misunderstanding, the strategy you describe above would only update the version of As such, for this case, which I think is a very common use case for humans upgrading dependencies, there should be some way to upgrade a single package and its transitive dependencies (if needed), but not anything else. The problem here specifically is the
And then on requesting an upgrade, bump that to:
In which case the strategy above of running something like a
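The behaviour being asked for here — re-resolve one package and its transitive dependencies while leaving every other pin alone — can be sketched as follows. Everything here is hypothetical (the function, the dict-shaped lock file and dependency graph, and the toy resolver are invented for illustration), not pipenv code:

```python
def selective_upgrade(lockfile, dep_graph, resolve, target):
    """Re-resolve `target` and its transitive dependencies only;
    every other pin in the lock file is left untouched."""
    allowed, stack = set(), [target]
    while stack:  # walk the dependency graph starting from the target
        pkg = stack.pop()
        if pkg not in allowed:
            allowed.add(pkg)
            stack.extend(dep_graph.get(pkg, []))
    return {name: resolve(name) if name in allowed else pin
            for name, pin in lockfile.items()}

locked = {"requests": "2.18.4", "urllib3": "1.21", "pytest": "3.2.0"}
graph = {"requests": ["urllib3"]}          # requests depends on urllib3
newer = {"requests": "2.19.1", "urllib3": "1.23", "pytest": "3.6.0"}
updated = selective_upgrade(locked, graph, newer.get, "requests")
# pytest keeps its old pin even though a newer version exists
assert updated == {"requests": "2.19.1", "urllib3": "1.23", "pytest": "3.2.0"}
```

This matches the `pip install --upgrade --upgrade-strategy=only-if-needed` spirit: unrelated pins never churn just because the lock file was regenerated.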
Sorry, I wasn't clear enough about step (iii) above. We lock all of the top-level dependencies to the version in the lockfile, but leave the sub-dependencies unlocked.
@greysteil I see – so this can potentially upgrade transitive dependencies of other direct dependencies?
Yep - theory is that if your top-level dependencies are specifying their sub-dependencies properly (i.e., pessimistically) then updating your sub-dependencies shouldn’t be dangerous. (I’d rather the behaviour was closer to Ruby’s Bundler, where only sub-dependencies of the dependency being updated can also be updated, but strong-arming pipenv to do that didn’t seem worth the hackery.)
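The constraint-generation trick @greysteil describes could be sketched roughly like this (hypothetical names and dict-shaped inputs; Dependabot actually munges requirement files rather than dicts):

```python
def update_constraints(pipfile, lockfile, target):
    """Pin every top-level dependency except `target` to its locked
    version; sub-dependencies are deliberately left out, so the
    resolver is free to float them to newer releases."""
    return {name: "*" if name == target else f"=={lockfile[name]}"
            for name in pipfile}

pipfile = {"requests": ">=2.18", "pytest": "*"}   # top-level deps only
locked = {"requests": "2.18.4", "pytest": "3.2.0", "urllib3": "1.21"}
# urllib3 is a sub-dependency, so it gets no constraint at all:
assert update_constraints(pipfile, locked, "requests") == {
    "requests": "*", "pytest": "==3.2.0"}
```

Which is exactly why it can bump transitive dependencies of *other* top-level packages, as noted above — nothing constrains them.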
@taion By design,
Now, there are currently CLI design & implementation problems affecting both of those usage models - for interactive batch updates, there are ways for the lock file and the local environment to get out of sync, and for the automated selective updates, there are challenges in implementing the tools that do the selective updates, as well as in enabling the related interactive selective operations to add new dependencies and modify the constraints on existing ones. But that core design philosophy of "Default to running the latest version of everything unless explicitly told otherwise in
Awesome to read that automated tools are a use case you're thinking about @ncoghlan. 🎉 If you ever want feedback on what Dependabot / others would like/need then let me know. Python's not my home language, so I've not been able to contribute to pip/pipenv yet, but I'd love to help in any way I can.
@ncoghlan I am totally in agreement with you here, and I think the approach you described makes sense. Using a
@techalchemy OK, I'll revise the initial post to cover the symmetric proposal (i.e. handling installation of new packages via That will still leave us with several open questions (like how
OK, I've updated the initial post with some proposed substeps that will allow us to reach a state where it's genuinely difficult to get the lock file and the local environment out of sync. Of the proposed substeps, I think the
Are the proposed changes still expected to fix the discrepancies between
@matthijskooijman It is not explicitly stated in the top post, but #1255 (comment) (which is mentioned in the post as current) does state that there needs to either be a way to “detect (and resolve) inconsistencies” in
The intent is for that inconsistency to get resolved at the (Note: this wasn't clear previously, so I've added a separate bullet point calling that step out)
I just noticed that pipenv does not regenerate the lockfile when you change the underlying python implementation between CPython and PyPy.
@joshfriend That's actually intentional, but see #857 (comment) for some comments on why it's currently problematic.
I think now that
i just took a stab at this fdebdc3 it has implementation details for sure (sub deps that exist for a develop package but not a default package won't get removed, for example), but it's a definite improvement.
fixed the implementation details. daa56e1
@kennethreitz Good point regarding With your implementation of the #1137 changes, that just leaves the Given the chosen implementation approach for #1486, the
Working on
Also added an additional setting:

```toml
[pipenv]
keep_outdated = true
```
Huzzah! Closing this, as any further limitations that aren't already covered by #857 can be reported as new issues after 10.1 is released :) |
working on a bug, but will be resolved shortly
fixed
As noted in #1137, `pipenv` doesn't currently check for requirement conflicts between `[packages]` and `[dev-packages]`, which can cause problems when attempting to generate a flat `requirements.txt` that covers both, as well as causing general weirdness when dependencies conflict.

The way `pipenv install` works (running the installation and then updating the lock file) means it is also relatively easy for the local environment to get out of sync with the lock file: if the installation request covers an already installed package, then it won't be updated locally, but the lock file will be updated to the latest version available from any configured package indices.

This issue doesn't cover an individual feature request. Instead, it establishes the goal of helping to ensure consistency between the lock file and the local environment by structuring installation commands to modify the lock file first, and then use the updated lock file to drive the actual installation step. (We don't have any kind of target ETA for how long this work will take - writing it down is just intended to help ensure we're all heading in the same direction, and are all comfortable with that direction)
A key aspect of this is going to be clarifying the division of responsibilities between `pipenv install`, `pipenv uninstall`, `pipenv lock`, and `pipenv update` (and potentially adding one or more new subcommands if deemed necessary).

Proposed substeps:

- A `pipenv sync` subcommand that ensures that the current environment matches the lock file (akin to `pip-sync` in `pip-tools`). (Implemented by @kennethreitz for 10.0.0, with `pipenv sync` ensuring versions of locked packages match the lock file, while `pipenv clean` removes packages that aren't in the lock file. I like that change relative to `pip-sync`, as it means that including an implicit sync in another command won't unexpectedly delete anything from the virtual environment)
- Retain `pipenv install` with no arguments, but switch to recommending the use of `pipenv sync` when setting up a fresh environment (that way `pipenv install` is used mainly for changing the installed components)
- Change `pipenv lock` to always ensure that `[packages]` and `[dev-packages]` are consistent with each other (resolution for "Pipenv ignores conflicts between default and development dependencies" #1137)
- Change `pipenv update` to be equivalent to `pipenv lock && pipenv sync` (`pipenv update` is instead being removed entirely - its old behaviour was actually comparable to what is now `pipenv sync && pipenv clean`)
- A `pipenv lock --keep-outdated` option that still generates a fresh `Pipfile.lock` from `Pipfile`, but minimises the changes made to only those needed to satisfy any changes made to `Pipfile` (whether that's package additions, removals, or changes to version constraints)
- Change `pipenv install <packages>` to be equivalent to "add or update entries in `Pipfile`" followed by `pipenv lock && pipenv sync` (implemented in "Change `pipenv install` to update the lock file first" #1486).
- A `--keep-outdated` option to `pipenv install` that it passes through to the `pipenv lock` operation.
- A `pipenv install --selective-upgrade <packages>` feature that's semantically equivalent to "remove those package entries from `Pipfile` (if present)", run `pipenv lock --keep-outdated`, then run `pipenv install --keep-outdated <packages>` (this is the final step that delivers support for "Updating only one locked dependency" #966). If just a package name is given, then the existing version constraint in `Pipfile` is used, otherwise the given version constraint overwrites the existing one. The effect of this on the current environment should be the same as `pip install --upgrade --upgrade-strategy=only-if-needed <packages>` in pip 9+ (except that `Pipfile.lock` will also be updated).

If anyone wants to pick up one of these substeps, make a comment below to say you're working on it (to attempt to minimise duplication of effort), then file a PR linking back to this issue. (Steps listed earlier will need to be completed before later steps are practical)
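For illustration only, the substeps above compose roughly as follows. This is a sketch with dicts standing in for `Pipfile`, `Pipfile.lock`, and the virtualenv, and with `--keep-outdated` simplified to "reuse any existing pin" (the real option must also re-resolve entries whose `Pipfile` constraints changed); none of these function names are pipenv internals:

```python
def lock(pipfile, resolve, old_lock=None, keep_outdated=False):
    """pipenv lock: pin everything; --keep-outdated reuses existing pins."""
    old_lock = old_lock or {}
    return {name: old_lock[name]
            if keep_outdated and name in old_lock
            else resolve(name, spec)
            for name, spec in pipfile.items()}

def sync(env, lockfile):
    """pipenv sync: make installed versions match the lock file."""
    env.update(lockfile)

def clean(env, lockfile):
    """pipenv clean: remove packages the lock file doesn't mention."""
    for name in list(env):
        if name not in lockfile:
            del env[name]

resolve = lambda name, spec: f"{name}==2.0"   # toy resolver
pipfile = {"requests": "*"}
old = {"requests": "requests==1.0"}
env = {"requests": "requests==1.0", "left-over": "left-over==0.1"}

new_lock = lock(pipfile, resolve, old_lock=old, keep_outdated=True)
sync(env, new_lock)
clean(env, new_lock)
assert new_lock == {"requests": "requests==1.0"}  # existing pin kept
assert env == {"requests": "requests==1.0"}       # stray package removed
```

The separation between `sync` (never removes anything) and `clean` (only removes) is what makes it safe for other commands to imply a sync.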
Open questions:

Resolved questions (at least for the immediate future):

- `pipenv lock` will continue to default to upgrading everything by default, with `pipenv lock --keep-outdated` to request a minimal update that only adjusts the lock file to account for `Pipfile` changes (additions, removals, and changes to version constraints)
- `pipenv install --skip-lock` will continue to work as it does today (even though it means the lock file and the local environment can get out of sync: use `pipenv sync && pipenv clean` to resync them)
- `pipenv install` and `pipenv install <package>` will continue to imply a full `pipenv lock` by default, with `pipenv install --keep-outdated` needed to request only the minimal changes required to satisfy the installation request
- `pipenv install <package>` will continue to retain the existing version constraint in `Pipfile` if none is given on the command line, even for the new `--selective-upgrade` option
- `pipenv uninstall <package>` will just remove the specified package[s], and hence may leave no longer needed dependencies in the local environment. Running `pipenv lock && pipenv sync && pipenv clean` will clear them out.

Note: the original proposal here was just to ensure that `[dev-packages]` and `[packages]` were kept in sync when generating the lock file, and the first few posts reflect that. The current proposal instead covers the lock file driven symmetric update proposal first mentioned in #1255 (comment)